Title
Sparse Communication for Federated Learning
Abstract
Federated learning trains a model on a centralized server using datasets distributed across a massive number of edge devices. Since federated learning does not send local data from edge devices to the server, it preserves data privacy; instead of the local data, it transfers the local models from the edge devices. However, communication cost is frequently a problem in federated learning. This paper proposes a novel method that reduces the communication cost of federated learning by transferring only the most strongly updated parameters of the neural network models. The proposed method allows the criterion for selecting updated parameters to be adjusted, trading off the reduction in communication cost against the loss of model accuracy. We evaluated the proposed method on diverse models and datasets and found that it achieves performance comparable to transferring the original models. For VGG16, the proposed method reduces the required communication cost by around 90% compared to the conventional method. Furthermore, we found that the proposed method reduces the communication cost of a large model more than that of a small model, owing to the different threshold of updated parameters in each model architecture.
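The abstract only sketches the idea of transferring the "top updated" parameters, so the following is a minimal illustrative sketch, assuming the selection criterion is the top-k largest-magnitude parameter deltas between the previous and current local model. The function names, the dictionary-based parameter representation, and the keep_ratio knob (standing in for the paper's adjustable update criterion) are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def sparsify_update(old_params, new_params, keep_ratio=0.1):
    """Client side: keep only the largest-magnitude parameter updates.

    old_params / new_params: dict mapping layer name -> np.ndarray.
    keep_ratio: assumed knob for the fraction of updates to transmit,
    mirroring the adjustable criterion described in the abstract.
    Returns dict mapping layer name -> (flat indices, delta values).
    """
    sparse_update = {}
    for name, old in old_params.items():
        delta = (new_params[name] - old).ravel()
        k = max(1, int(delta.size * keep_ratio))
        # indices of the k largest-magnitude updates in this layer
        idx = np.argpartition(np.abs(delta), -k)[-k:]
        sparse_update[name] = (idx, delta[idx])
    return sparse_update

def apply_sparse_update(params, sparse_update):
    """Server side: add the received sparse deltas into the global model."""
    for name, (idx, values) in sparse_update.items():
        flat = params[name].ravel()
        flat[idx] += values
        params[name] = flat.reshape(params[name].shape)
    return params
```

Under this sketch, lowering keep_ratio reduces the transmitted payload at the risk of accuracy loss, which is the trade-off the paper evaluates.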
Year
2022
DOI
10.1109/ICFEC54809.2022.00008
Venue
2022 IEEE 6th International Conference on Fog and Edge Computing (ICFEC)
Keywords
Sparse Communication, Edge Computing, Federated Learning, Neural Networks
DocType
Conference
ISBN
978-1-6654-9525-7
Citations
0
PageRank
0.34
References
12
Authors
6