Title
COFEL: Communication-Efficient and Optimized Federated Learning with Local Differential Privacy
Abstract
Federated learning can collaboratively train a global model without gathering clients' private data. Many works reduce communication cost by designing client selection methods or averaging algorithms. However, these approaches only decide whether a client participates; training time is not reduced, because the size of each client's update remains unchanged. We propose COFEL, a novel federated learning system that both reduces communication time through layer-based parameter selection and strengthens privacy protection by applying a local differential privacy mechanism to the selected parameters. We present the COFEL-AVG algorithm for global aggregation and design a layer-based parameter selection method that picks the most valuable parameters for global aggregation, optimizing both the communication and the training process; since only the selected parameters are transferred, the update size is reduced. We compare COFEL with a traditional federated learning system and with CMFL, which also applies parameter selection but at the model level, through experiments on MNIST, Fashion-MNIST, and CIFAR-10. The results show that COFEL improves accuracy by up to 22.8% over CMFL on CIFAR-10, and reduces the training time needed to reach an accuracy of 0.85 on Fashion-MNIST by around 20% and 48% compared with traditional FL and CMFL, respectively.
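The abstract does not spell out the selection or perturbation details, but a minimal sketch of the general idea it describes, layer-level selection of client updates followed by local differential privacy noise before upload, could look like the following. The relevance score (mean update magnitude), the Laplace mechanism, the clipping bound, and the even per-layer budget split are all illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of layer-based parameter selection with LDP noise,
# loosely following the abstract's description of COFEL. All concrete
# choices below (scoring rule, Laplace mechanism, budget split) are
# assumptions for illustration.
import numpy as np

def select_and_perturb(layer_updates, k, epsilon, clip=1.0):
    """Pick the k layers with the largest update magnitudes, clip them,
    and add Laplace noise calibrated to the clipping bound.

    layer_updates: dict mapping layer name -> np.ndarray of update deltas
    k:             number of layers to upload (assumed selection granularity)
    epsilon:       per-round LDP budget, split evenly over k layers (assumption)
    clip:          L-infinity clipping bound per coordinate (assumption)
    """
    # Rank layers by the mean absolute value of their update.
    scores = {name: np.mean(np.abs(u)) for name, u in layer_updates.items()}
    selected = sorted(scores, key=scores.get, reverse=True)[:k]

    perturbed = {}
    for name in selected:
        u = np.clip(layer_updates[name], -clip, clip)
        # Laplace mechanism: per-coordinate sensitivity 2*clip,
        # per-layer budget epsilon / k.
        scale = 2.0 * clip / (epsilon / k)
        perturbed[name] = u + np.random.laplace(0.0, scale, size=u.shape)
    # Only this subset is transferred to the server, which is what
    # shrinks the per-client update size.
    return perturbed
```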
Year
2021
DOI
10.1109/ICC42927.2021.9500632
Venue
ICC 2021 - IEEE International Conference on Communications
Keywords
Federated learning, local differential privacy, parameter selection, communication-efficient
DocType
Conference
ISSN
1550-3607
ISBN
978-1-7281-7123-4
Citations
2
PageRank
0.37
References
0
Authors
3
Name            Order  Citations  PageRank
Zhuotao Lian    1      2          1.38
Weizheng Wang   2      6          1.43
Chunhua Su      3      2          0.37