Paper Detail

Paper Title: Profile-based Privacy for Locally Private Computations
Paper Identifier: MO4.R3.1
Authors: Joseph Donald Geumlek, Kamalika Chaudhuri, University of California, San Diego, United States
Session: Privacy
Location: Monge, Level 3
Session Time: Monday, 08 July, 16:40 - 18:00
Presentation Time: Monday, 08 July, 16:40 - 17:00
Abstract: Differential privacy has emerged as a gold standard in privacy-preserving data analysis. A popular variant is local differential privacy, where the data holder is the trusted curator. A major barrier toward wider adoption of this model, however, is that it offers a poor privacy-utility tradeoff. In this work, we address this problem by introducing a new variant of local privacy called profile-based privacy. The central idea is that the problem setting comes with a graph G of data-generating distributions, whose edges encode sensitive pairs of distributions that should be made indistinguishable. This provides higher utility because, unlike local differential privacy, we no longer need to make every pair of private values in the domain indistinguishable, and instead only protect the identity of the underlying distribution. We establish privacy properties of the profile-based privacy definition, such as post-processing invariance and graceful composition. Finally, we provide mechanisms that are private in this framework, and show via simulations that they achieve higher utility than the corresponding local differential privacy mechanisms.
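As a rough illustration of the edge-indistinguishability idea sketched in the abstract (this is an inferred formalization, not necessarily the manuscript's exact definition; the mechanism M, privacy parameter \varepsilon, and edge notation are assumptions based only on the abstract), a profile-based privacy guarantee over the graph G of data-generating distributions could read:

\[
\Pr\bigl[M(X) \in S \mid X \sim P_i\bigr] \;\le\; e^{\varepsilon}\, \Pr\bigl[M(X) \in S \mid X \sim P_j\bigr]
\quad \text{for every edge } (P_i, P_j) \in G \text{ and every output set } S,
\]

with the symmetric bound obtained by exchanging P_i and P_j. By contrast, local differential privacy would impose such a bound on every pair of individual data values in the domain, which is the stronger requirement the abstract argues leads to a poorer privacy-utility tradeoff.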