Title
A new analysis of differential privacy’s generalization guarantees (invited paper)
Abstract
We give a new proof of the "transfer theorem" underlying adaptive data analysis: that any mechanism for answering adaptively chosen statistical queries that is differentially private and sample-accurate is also accurate out-of-sample. Our new proof is elementary and gives structural insights that we expect will be useful elsewhere. We show: 1) that differential privacy ensures that the expectation of any query on the conditional distribution on datasets induced by the transcript of the interaction is close to its true value on the data distribution, and 2) sample accuracy on its own ensures that any query answer produced by the mechanism is close to its conditional expectation with high probability. This second claim follows from a thought experiment in which we imagine that the dataset is resampled from the conditional distribution after the mechanism has committed to its answers. The transfer theorem then follows by summing these two bounds. An upshot of our new proof technique is that the concrete bounds we obtain are substantially better than the best previously known bounds.
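The abstract's argument can be summarized as a triangle-inequality decomposition. In the minimal sketch below, the symbols are assumed for illustration and are not taken from the record: P is the data distribution, pi the transcript of the interaction, Q_pi the conditional distribution on datasets induced by pi, q a statistical query with true value q(P) on the data distribution, and a the answer the mechanism reports for q.

\[
\bigl|\, a - q(\mathcal{P}) \,\bigr|
\;\le\;
\underbrace{\bigl|\, a - \mathbb{E}_{S \sim Q_\pi}[q(S)] \,\bigr|}_{\text{claim 2: sample accuracy + resampling}}
\;+\;
\underbrace{\bigl|\, \mathbb{E}_{S \sim Q_\pi}[q(S)] - q(\mathcal{P}) \,\bigr|}_{\text{claim 1: differential privacy}}
\]

Claim 1 bounds the second term: differential privacy keeps the conditional expectation of q under Q_pi close to its true value q(P). Claim 2 bounds the first term with high probability, via the thought experiment in which the dataset is resampled from Q_pi after the mechanism has committed to a. Summing the two bounds gives the transfer theorem.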
Year: 2021
DOI: 10.1145/3406325.3465358
Venue: ACM Symposium on Theory of Computing
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 6
Name                        Order  Citations  PageRank
Christopher Jung            1      12         4.83
Katrina Ligett              2      923        66.19
Seth Neel                   3      52         7.86
Aaron Roth                  4      1937       110.48
Saeed Sharifi-Malvajerdi    5      1          3.74
Moshe Shenfeld              6      0          1.35