Title
HybridAlpha: An Efficient Approach for Privacy-Preserving Federated Learning
Abstract
Federated learning has emerged as a promising approach for collaborative and privacy-preserving learning. Participants in a federated learning process cooperatively train a model by exchanging model parameters instead of the actual training data, which they might want to keep private. However, parameter interaction and the resulting model may still disclose information about the training data used. To address these privacy concerns, several approaches have been proposed based on differential privacy and secure multiparty computation (SMC), among others. These approaches, however, often incur large communication overhead and slow training times. In this paper, we propose HybridAlpha, an approach for privacy-preserving federated learning employing an SMC protocol based on functional encryption. This protocol is simple, efficient, and resilient to participants dropping out. We evaluate our approach in terms of training time and data volume exchanged, using a federated learning process to train a CNN on the MNIST data set. Evaluation against existing crypto-based SMC solutions shows that HybridAlpha can reduce the training time by 68% and the data transfer volume by 92% on average, while providing the same model performance and privacy guarantees as the existing solutions.
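The abstract's core idea is that an SMC protocol based on (inner-product) functional encryption lets an aggregator learn only a weighted sum of client model updates, never an individual update. The sketch below is a toy simulation of that functionality only, not the paper's actual cryptosystem: real masks would be derived cryptographically, and the names (`ToyTPA`, `functional_key`) are illustrative assumptions. The trusted-third-party role mirrors the key authority used in such schemes.

```python
# Toy illustration (NOT the paper's actual scheme) of what inner-product
# functional encryption provides for federated aggregation: the aggregator
# recovers only the weighted sum of client updates. Random masks stand in
# for real ciphertext randomness.
import random

class ToyTPA:
    """Hypothetical trusted party: issues per-client masks and functional keys."""
    def __init__(self, num_clients, dim):
        # One random mask vector per client (shared only with that client).
        self.masks = [[random.randrange(1 << 30) for _ in range(dim)]
                      for _ in range(num_clients)]

    def mask_for(self, client_id):
        return self.masks[client_id]

    def functional_key(self, weights):
        # sk_y[j] = sum_i weights[i] * mask_i[j]; unmasks only the weighted sum.
        dim = len(self.masks[0])
        return [sum(w * m[j] for w, m in zip(weights, self.masks))
                for j in range(dim)]

def encrypt(update, mask):
    # "Ciphertext" = update + mask, elementwise.
    return [u + m for u, m in zip(update, mask)]

def aggregate(ciphertexts, weights, sk_y):
    # Weighted sum of ciphertexts minus the functional key
    # = weighted sum of the plaintext updates.
    dim = len(sk_y)
    return [sum(w * c[j] for w, c in zip(weights, ciphertexts)) - sk_y[j]
            for j in range(dim)]

# Three clients, 4-dimensional "gradients", equal weights.
tpa = ToyTPA(num_clients=3, dim=4)
updates = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
weights = [1, 1, 1]
cts = [encrypt(u, tpa.mask_for(i)) for i, u in enumerate(updates)]
agg = aggregate(cts, weights, tpa.functional_key(weights))
print(agg)  # [15, 18, 21, 24] == elementwise sum of the updates
```

Note how dropout resilience, which the abstract highlights, falls out naturally: if a client drops, the aggregator simply requests a functional key for a weight vector that zeroes out that client.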
Year: 2019
DOI: 10.1145/3338501.3357371
Venue: Proceedings of the 12th ACM Workshop on Artificial Intelligence and Security
Keywords: federated learning, functional encryption, neural networks, privacy
Field: Computer security, Computer science
DocType: Conference
ISBN: 978-1-4503-6833-9
Citations: 12
PageRank: 0.54
References: 26
Authors: 5
Name | Order | Citations | PageRank
Runhua Xu | 1 | 18 | 2.36
Nathalie Baracaldo | 2 | 111 | 12.47
Yi Zhou | 3 | 88 | 5.26
Ali Anwar | 4 | 113 | 14.83
Heiko Ludwig | 5 | 1278 | 147.99