
Towards Measuring Fairness for Local Differential Privacy

2023
Local differential privacy (LDP) approaches provide data subjects with the strong privacy guarantees of Differential Privacy under the scenario of untrusted data curators. They are used by companies (e.g., Google’s RAPPOR) to collect potentially sensitive data from clients through randomized response.
Salas, Julián   +2 more
openaire   +2 more sources
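
The randomized-response primitive mentioned in the entry above (the building block behind systems such as Google's RAPPOR) can be illustrated with a minimal binary sketch. The function names and the debiasing helper below are illustrative only and are not taken from the paper.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float) -> int:
    """Report a single bit under epsilon-local differential privacy.

    The true bit is kept with probability e^eps / (e^eps + 1) and flipped
    otherwise, so the ratio of output probabilities for any two inputs is
    bounded by e^eps.
    """
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_keep else 1 - true_bit

def estimate_frequency(reports: list[int], epsilon: float) -> float:
    """Debiased estimate of the fraction of users whose true bit is 1."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # Invert E[observed] = f * p + (1 - f) * (1 - p)
    return (observed + p - 1.0) / (2.0 * p - 1.0)
```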

Collecting Preference Rankings Under Local Differential Privacy

2019 IEEE 35th International Conference on Data Engineering (ICDE), 2019
In this paper, we initiate the study of collecting preference rankings under local differential privacy. The key technical challenge comes from the fact that the number of possible rankings increases factorially in the number of items to rank. In practical settings, this number could be large, leading to excessive injected noise. To solve this problem, …
Xiang Cheng   +5 more
openaire   +1 more source
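
One generic way to sidestep the factorial ranking domain is sketched below, assuming each user reports a single randomly sampled pairwise comparison perturbed by binary randomized response. This is an illustration of the domain-size problem only, not necessarily the decomposition used in the paper above.

```python
import math
import random
from itertools import combinations

def report_pairwise_preference(ranking: list[int], epsilon: float) -> tuple[int, int, int]:
    """Sample one item pair and report whether the first item is ranked
    above the second, perturbed by binary randomized response.

    `ranking` is a list of item ids in preference order. Reporting a full
    ranking directly would require a domain of k! outcomes (10 items already
    give 10! = 3,628,800 rankings); a single pairwise bit keeps the report
    binary.
    """
    i, j = random.choice(list(combinations(range(len(ranking)), 2)))
    prefers_i = int(ranking.index(i) < ranking.index(j))
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    noisy = prefers_i if random.random() < p_keep else 1 - prefers_i
    return i, j, noisy
```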

Randomized requantization with local differential privacy

2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016
In this paper we study how individual sensors can compress their observations in a privacy-preserving manner. We propose a randomized requantization scheme that guarantees local differential privacy, a strong model for privacy in which individual data holders must mask their information before sending it to an untrusted third party.
Sijie Xiong   +2 more
openaire   +1 more source
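
One plausible reading of "randomized requantization" is a two-step pipeline: quantize the sensor reading to a small number of levels, then perturb the level with k-ary (generalized) randomized response. The sketch below shows that generic pipeline; the function names and the uniform level grid are assumptions for illustration, not the paper's actual scheme.

```python
import math
import random

def quantize(x: float, low: float, high: float, k: int) -> int:
    """Map a bounded sensor reading to one of k quantization levels."""
    x = min(max(x, low), high)
    level = int((x - low) / (high - low) * k)
    return min(level, k - 1)

def k_ary_randomized_response(level: int, k: int, epsilon: float) -> int:
    """Generalized randomized response over k levels: keep the true level
    with probability e^eps / (e^eps + k - 1), otherwise report one of the
    other levels uniformly at random.
    """
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_keep:
        return level
    others = [l for l in range(k) if l != level]
    return random.choice(others)
```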

Multiple Privacy Regimes Mechanism for Local Differential Privacy

2019
Local differential privacy (LDP), as a state-of-the-art privacy notion, enables users to share protected data safely while the private real data never leaves the user’s device. The privacy regime is one of the critical parameters balancing the accuracy of the statistical result against the level of the user’s privacy.
Yutong Ye   +4 more
openaire   +1 more source
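
When different users report under different privacy regimes (per-user ε values), a simple baseline is to debias each ε group separately and then average the group estimates weighted by group size. The sketch below assumes binary randomized response and is a naive combination, not the mechanism proposed in the paper; an inverse-variance weighting of the groups would usually give a better estimate.

```python
import math
from collections import defaultdict

def combine_groups(reports: list[tuple[int, float]]) -> float:
    """Combine binary randomized-response reports collected under several
    privacy regimes.

    Each report is a (noisy_bit, epsilon) pair. Reports are grouped by
    epsilon, each group is debiased on its own, and the group estimates are
    averaged weighted by group size.
    """
    groups: dict[float, list[int]] = defaultdict(list)
    for bit, eps in reports:
        groups[eps].append(bit)

    total = sum(len(bits) for bits in groups.values())
    estimate = 0.0
    for eps, bits in groups.items():
        p = math.exp(eps) / (math.exp(eps) + 1.0)
        observed = sum(bits) / len(bits)
        f_hat = (observed + p - 1.0) / (2.0 * p - 1.0)  # per-group debiasing
        estimate += (len(bits) / total) * f_hat
    return estimate
```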

Local Differential Privacy for Data Streams

2020
The dynamic change, huge data size, and complex structure of data streams make them very difficult to analyze and protect in real time. Traditional privacy protection models such as differential privacy need to rely on trusted servers or companies, which increases the uncertainty of protecting streaming privacy.
Xianjin Fang, Qingkui Zeng, Gaoming Yang
openaire   +1 more source

Differential Privacy in the Local Setting

Proceedings of the Fourth ACM International Workshop on Security and Privacy Analytics, 2018
Differential privacy has been increasingly accepted as the de facto standard for data privacy in the research community. While many algorithms have been developed for data publishing and analysis satisfying differential privacy, there have been few deployments of such techniques.
openaire   +1 more source

Multidisciplinary standards of care and recent progress in pancreatic ductal adenocarcinoma

Ca-A Cancer Journal for Clinicians, 2020
Aaron J Grossberg   +2 more
exaly  

Cancer statistics in China, 2015

Ca-A Cancer Journal for Clinicians, 2016
Rongshou Zheng   +2 more
exaly  

Local Cancer Recurrence: The Realities, Challenges, and Opportunities for New Therapies

Ca-A Cancer Journal for Clinicians, 2018
David A Mahvi   +2 more
exaly  
