Title
Achieving Consensus in Privacy-Preserving Decentralized Learning
Abstract
Machine learning algorithms have been widely deployed on decentralized systems so that users with private, local data can jointly contribute to a better generalized model. One promising approach is Aggregation of Teacher Ensembles, which transfers the knowledge of locally trained models to a global one without releasing any private data. However, previous methods largely focus on privately aggregating the local results without checking their validity, which easily leads to erroneous aggregation results, especially when data is unbalanced across users. Hence, we propose a private consensus protocol that reveals nothing but the label with the highest number of votes, on the condition that the vote count exceeds a given threshold. The purpose is to filter out undesired aggregation results that could hurt the performance of the aggregator model. Our protocol also guarantees differential privacy, so that an adversary with auxiliary information cannot gain any additional knowledge from the results. We show that our protocol achieves the same privacy level as previous works with improved accuracy.
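The thresholded aggregation the abstract describes can be sketched roughly as follows. This is a simplified illustration only: the function name and parameters are assumptions, and Laplace-noised counts stand in for the paper's actual private consensus protocol. The idea shown is the same, though: release the top-voted label only when its (noisy) vote count clears a threshold, and return nothing otherwise.

```python
import numpy as np

def noisy_threshold_vote(votes, threshold, epsilon, seed=None):
    """Return the top-voted label if its noisy count clears `threshold`,
    else None.

    Sketch under assumptions: Laplace noise on the per-label vote counts
    gives a differentially private argmax, and the threshold filters out
    weak-consensus queries that could mislead the aggregator model.
    """
    rng = np.random.default_rng(seed)
    counts = np.bincount(votes)                       # votes per label
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    winner = int(np.argmax(noisy))                    # noisy-max label
    return winner if noisy[winner] >= threshold else None
```

For example, with votes `[1, 1, 1, 1, 2]` and a threshold of 3, the clear majority label 1 is released; raising the threshold above the total number of voters suppresses the answer instead.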
Year
2020
DOI
10.1109/ICDCS47774.2020.00086
Venue
2020 IEEE 40th International Conference on Distributed Computing Systems (ICDCS)
Keywords
Differential privacy, decentralized machine learning
DocType
Conference
ISSN
1063-6927
ISBN
978-1-7281-7003-9
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order  Citations  PageRank
Liyao Xiang     1      21         6.50
Lingdong Wang   2      0          0.68
Shufan Wang     3      0          0.34
Baochun Li      4      9416       614.20