Title: A near-optimal algorithm for differentially-private principal components
Abstract
The principal components analysis (PCA) algorithm is a standard tool for identifying good low-dimensional approximations to high-dimensional data. Many data sets of interest contain private or sensitive information about individuals. Algorithms which operate on such data should be sensitive to the privacy risks in publishing their outputs. Differential privacy is a framework for developing tradeoffs between privacy and the utility of these outputs. In this paper we investigate the theory and empirical performance of differentially private approximations to PCA and propose a new method which explicitly optimizes the utility of the output. We show that the sample complexity of the proposed method differs from the existing procedure in the scaling with the data dimension, and that our method is nearly optimal in terms of this scaling. We furthermore illustrate our results, showing that on real data there is a large performance gap between the existing method and our method.
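The abstract contrasts an existing noise-addition procedure with the proposed method. As context, one common baseline for differentially private PCA perturbs the empirical second-moment matrix with symmetric Gaussian noise before eigendecomposition; the sketch below illustrates that input-perturbation idea only, not the paper's own algorithm. The function name and the `noise_scale` parameter are illustrative placeholders: in a real mechanism the noise magnitude would be calibrated by a formal differential-privacy analysis.

```python
import numpy as np

def private_pca_input_perturbation(X, k, noise_scale, rng=None):
    """Illustrative sketch (not the paper's method): add symmetric
    Gaussian noise to the empirical second-moment matrix, then
    return its top-k eigenvectors. `noise_scale` stands in for the
    calibration a privacy analysis would dictate."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    A = X.T @ X / n                       # empirical second-moment matrix
    E = rng.normal(scale=noise_scale, size=(d, d))
    E = (E + E.T) / 2                     # symmetrize the noise
    eigvals, eigvecs = np.linalg.eigh(A + E)
    top = np.argsort(eigvals)[::-1][:k]   # indices of the k largest eigenvalues
    return eigvecs[:, top]
```

Because the data enter only through the perturbed matrix `A + E`, the released subspace depends on any single record only up to the added noise, which is the intuition behind such input-perturbation baselines.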
Year: 2013
DOI: 10.5555/2567709.2567754
Venue: Journal of Machine Learning Research
Keywords: existing method, differentially-private principal component, differentially private approximation, differential privacy, new method, empirical performance, near-optimal algorithm, data dimension, privacy risk, principal components analysis, dimension reduction
Field: Data mining, Data set, Dimensionality reduction, Computer science, Artificial intelligence, Information sensitivity, Scaling, Differential privacy, Algorithm, Sample complexity, Principal component analysis, Machine learning, Performance gap
DocType: Journal
Volume: 14
Issue: Issue-in-Progress
ISSN: 1532-4435
Citations: 23
PageRank: 1.12
References: 48
Authors: 3
Name                 Order  Citations  PageRank
Kamalika Chaudhuri   1      1503       96.90
Anand D. Sarwate     2      615        47.82
Kaushik Sinha        3      244        17.81