Title
Differentially Private Gaussian Processes.
Abstract
A major challenge for machine learning is increasing the availability of data while respecting the privacy of individuals. Here we combine the provable privacy guarantees of the Differential Privacy framework with the flexibility of Gaussian processes (GPs). We propose a method using GPs to provide Differentially Private (DP) regression. We then improve this method by crafting the DP noise covariance structure to efficiently protect the training data, while minimising the scale of the added noise. We find that, for the dataset used, this cloaking method achieves the greatest accuracy, while still providing privacy guarantees, and offers practical DP for regression over multi-dimensional inputs. Together these methods provide a starter toolkit for combining differential privacy and GPs.
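As a rough illustration of the idea summarised above (not the authors' exact cloaking algorithm), the sketch below perturbs a GP posterior mean with Gaussian-mechanism noise whose scale is set by the sensitivity of the predictions to a single bounded training output. The kernel choice, the output bound d, the (eps, delta) values, and the function names rbf_kernel and dp_gp_posterior_mean are all illustrative assumptions; the paper instead shapes the noise covariance to reduce the added noise.

import numpy as np


def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    sqdist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)


def dp_gp_posterior_mean(X, y, Xstar, noise_var=0.1, d=1.0, eps=1.0,
                         delta=1e-5, seed=0):
    # Standard GP regression posterior mean at test inputs Xstar.
    K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
    C = rbf_kernel(Xstar, X) @ np.linalg.inv(K)  # d(mean)/d(y): "cloaking" matrix
    mean = C @ y

    # If one training output y_i (assumed bounded, |y_i| <= d) is replaced,
    # the mean moves by at most 2*d times the norm of column i of C.
    sensitivity = 2.0 * d * np.linalg.norm(C, axis=0).max()

    # Gaussian mechanism: isotropic noise calibrated for (eps, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    rng = np.random.default_rng(seed)
    return mean + rng.normal(0.0, sigma, size=mean.shape)


# Toy usage: noisy sine observations, private predictions on a grid.
X = np.linspace(0.0, 5.0, 20)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).normal(size=20)
Xstar = np.linspace(0.0, 5.0, 50)[:, None]
private_mean = dp_gp_posterior_mean(X, y, Xstar)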
Year
2016
Venue
arXiv: Machine Learning
Field
Training set, Data mining, Cloaking, Regression, Differential privacy, Computer science, Artificial intelligence, Global Positioning System, Gaussian process, Machine learning, Covariance
DocType
Journal
Volume
abs/1606.00720
Citations
0
PageRank
0.34
References
11
Authors
3
Name | Order | Citations | PageRank
Michael Thomas Smith | 1 | 0 | 0.68
Max Zwiessele | 2 | 2 | 1.05
Neil D. Lawrence | 3 | 3411 | 268.51