Abstract |
---|
We study a basic private estimation problem: each of n users draws a single i.i.d. sample from an unknown Gaussian distribution N(μ, σ²), and the goal is to estimate μ while guaranteeing local differential privacy for each user. As minimizing the number of rounds of interaction is important in the local setting, we provide adaptive two-round solutions and nonadaptive one-round solutions to this problem. We match these upper bounds with an information-theoretic lower bound showing that our accuracy guarantees are tight up to logarithmic factors for all sequentially interactive locally private protocols. |
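The abstract does not reproduce the paper's actual protocols, but the problem setting can be illustrated with a standard one-round (noninteractive) locally private baseline: each user clips their sample to a bounded range and adds Laplace noise locally, and the aggregator averages the noisy reports. This is a minimal sketch of that generic baseline, not the paper's mechanism; the clipping bound `B`, privacy parameter `eps`, and function names are illustrative assumptions.

```python
import math
import random

def local_release(x, B, eps):
    """One user's eps-locally-DP report: clip x to [-B, B], add Laplace noise.

    The clipped value has sensitivity 2B, so Laplace(2B/eps) noise gives
    eps-local differential privacy for this single report.
    """
    clipped = max(-B, min(B, x))
    scale = 2.0 * B / eps
    u = random.random() - 0.5  # inverse-CDF sampling of a Laplace variable
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return clipped + noise

def estimate_mean(samples, B, eps):
    """Aggregator averages the noisy reports (unbiased up to clipping bias)."""
    return sum(local_release(x, B, eps) for x in samples) / len(samples)

# Illustrative run: n users each draw one sample from N(mu, sigma^2).
random.seed(0)
n, mu, sigma, B, eps = 20000, 1.0, 1.0, 5.0, 2.0
samples = [random.gauss(mu, sigma) for _ in range(n)]
est = estimate_mean(samples, B, eps)
```

Note the accuracy-privacy trade-off visible here: the per-report noise has standard deviation proportional to B/eps, so the error of the averaged estimate shrinks as 1/sqrt(n) but grows as the privacy budget eps tightens. The paper's contribution is protocols (and matching lower bounds) for the harder setting where a suitable bound on μ is not known in advance.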
Year | Venue | Keywords |
---|---|---|
2018 | ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019) | local differential privacy |
Field | DocType | Volume |
---|---|---|
Mathematical optimization, Data domain, Differential privacy, Upper and lower bounds, Gaussian, Logarithm, Mathematics | Journal | 32 |
ISSN | Citations | PageRank |
---|---|---|
1049-5258 | 0 | 0.34 |
References | Authors |
---|---|
9 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Joseph, Matthew | 1 | 49 | 3.77 |
Janardhan Kulkarni | 2 | 153 | 17.73 |
Jieming Mao | 3 | 54 | 9.19 |
Zhiwei Steven Wu | 4 | 157 | 30.92 |