Title
CORK: A privacy-preserving and lossless federated learning scheme for deep neural network
Abstract
With the advance of machine learning technology and especially the explosive growth of big data, federated learning, which allows multiple participants to jointly train a high-quality global machine learning model, has gained extensive attention. However, in federated learning, it has been shown that inference attacks can reveal sensitive information from both local updates and global model parameters, which seriously threatens user privacy. To address this challenge, in this paper, a privacy-preserving and lossless federated learning scheme, named CORK, is proposed for deep neural networks. With CORK, multiple participants can train a global model securely and accurately with the assistance of an aggregation server. Specifically, we first design a drop-tolerant secure aggregation algorithm, FTSA, which ensures the confidentiality of local updates. Then, a lossless model perturbation mechanism, PTSP, is proposed to protect sensitive data in global model parameters. Furthermore, the neuron pruning operation in PTSP reduces the scale of the model, which significantly improves computation and communication efficiency. Detailed security analysis shows that CORK can resist inference attacks on both local updates and global model parameters. In addition, CORK is implemented and evaluated on the real-world MNIST and CIFAR-10 datasets, and the experimental results demonstrate that CORK is indeed effective and efficient.
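To illustrate the general idea behind secure aggregation of local updates, the sketch below shows a generic pairwise-masking scheme in which each client blinds its update with masks that cancel when the server sums the contributions. This is only a minimal, hypothetical sketch and is not the paper's FTSA algorithm: the key agreement, dropout recovery, and the PTSP perturbation and pruning steps are omitted, and all function names here are illustrative.

```python
import numpy as np


def pairwise_mask(seed: int, shape) -> np.ndarray:
    """Derive a deterministic mask from a seed assumed to be shared by a client pair."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal(shape)


def mask_update(client_id: int, update: np.ndarray, clients, shared_seeds) -> np.ndarray:
    """Blind one client's local update with pairwise masks that cancel in the sum."""
    masked = update.copy()
    for peer in clients:
        if peer == client_id:
            continue
        mask = pairwise_mask(shared_seeds[frozenset((client_id, peer))], update.shape)
        # The lower-id party adds the mask and the higher-id party subtracts it,
        # so each pair's masks sum to zero at the aggregation server.
        masked += mask if client_id < peer else -mask
    return masked


if __name__ == "__main__":
    clients = [0, 1, 2]
    dim = 5
    # Hypothetical local updates (e.g., gradients) held by each client.
    updates = {c: np.random.default_rng(c).standard_normal(dim) for c in clients}
    # Pairwise seeds assumed to be agreed beforehand via key exchange (not shown).
    shared_seeds = {frozenset((i, j)): 1000 + 10 * i + j
                    for i in clients for j in clients if i < j}

    masked_updates = [mask_update(c, updates[c], clients, shared_seeds) for c in clients]
    # The server only sees masked updates, yet their sum equals the true aggregate,
    # i.e., aggregation is lossless while individual updates stay hidden.
    server_sum = np.sum(masked_updates, axis=0)
    true_sum = np.sum([updates[c] for c in clients], axis=0)
    assert np.allclose(server_sum, true_sum)
    print("Aggregated update recovered losslessly:", server_sum)
```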
Year: 2022
DOI: 10.1016/j.ins.2022.04.052
Venue: Information Sciences
Keywords: Federated learning, Deep neural network, Privacy-preserving, Secure aggregation, Model perturbation
DocType: Journal
Volume: 603
ISSN: 0020-0255
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name          Order  Citations  PageRank
Jiaqi Zhao    1      0          0.34
Hui Zhu       2      83         17.00
Fengwei Wang  3      9          3.51
Rongxing Lu   4      5091       301.87
Hui Li        5      394        35.42
Jingwei Tu    6      0          0.34
Jie Shen      7      0          0.34