Title
Differentially Private Knowledge Distillation for Mobile Analytics
Abstract
The increasing demand for on-device deep learning necessitates the deployment of deep models on mobile devices. However, directly deploying deep models on mobile devices presents both a capacity bottleneck and prohibitive privacy risks. To address these problems, we develop a Differentially Private Knowledge Distillation (DPKD) framework that enables on-device deep learning while preserving training data privacy. We modify the conventional Private Aggregation of Teacher Ensembles (PATE) paradigm by compressing the knowledge acquired by the ensemble of teachers into a student model in a differentially private manner. The student model is then trained on both the labeled public data and the distilled knowledge using a mixed training algorithm. Extensive experiments on popular image datasets, as well as a real implementation on a mobile device, show that DPKD not only benefits from the distilled knowledge but also provides a strong differential privacy guarantee (ε = 2) with only a marginal decrease in accuracy.
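The PATE-style aggregation underlying DPKD can be illustrated with a minimal sketch: each teacher in the ensemble votes on the label of a public sample, calibrated Laplace noise is added to the vote histogram, and the noisy argmax becomes the distilled label used to train the student. The noise scale, helper name, and example votes below are illustrative assumptions, not the paper's actual implementation.

import numpy as np

def noisy_aggregate_label(teacher_preds, num_classes, gamma=0.05, rng=None):
    """PATE-style noisy vote aggregation (illustrative sketch).

    teacher_preds: iterable of class indices, one vote per teacher.
    gamma: inverse noise scale; Laplace noise with scale 1/gamma is added
           to each class count (this value is an assumption, not the one
           used in the DPKD paper).
    """
    rng = np.random.default_rng() if rng is None else rng
    counts = np.bincount(np.asarray(teacher_preds), minlength=num_classes)
    noisy_counts = counts + rng.laplace(loc=0.0, scale=1.0 / gamma, size=num_classes)
    return int(np.argmax(noisy_counts))

# Example: 10 teachers vote on one public sample from a 10-class task.
votes = [3, 3, 3, 7, 3, 3, 1, 3, 3, 7]
distilled_label = noisy_aggregate_label(votes, num_classes=10)

The privacy guarantee comes from the Laplace noise: the distilled label depends on any single training record of any single teacher only through that teacher's one vote, so the noisy argmax is differentially private with respect to the teachers' training data.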
Year
2020
DOI
10.1145/3397271.3401259
Venue
SIGIR '20: The 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, Virtual Event, China, July 2020
DocType
Conference
ISBN
978-1-4503-8016-4
Citations
1
PageRank
0.35
References
3
Authors
2
Name            Order   Citations   PageRank
Lingjuan Lyu    1       33          4.61
Chi-Hua Chen    2       66          18.92