Title
Communication-Efficient Asynchronous Federated Learning In Resource-Constrained Edge Computing
Abstract
Federated learning (FL) has been widely used to train machine learning models over massive data in edge computing. However, existing FL solutions may incur long training times and/or high resource (e.g., bandwidth) costs, and thus cannot be directly applied to resource-constrained edge nodes, such as base stations and access points. In this paper, we propose a novel communication-efficient asynchronous federated learning (CE-AFL) mechanism, in which the parameter server aggregates local model updates only from a fraction alpha (0 < alpha < 1) of all edge nodes, selected by arrival order in each epoch. As a case study, we design efficient algorithms to determine the optimal value of alpha for two cases of CE-AFL, a single learning task and multiple learning tasks, under bandwidth constraints. We formally prove the convergence of the proposed algorithm. We evaluate its performance with experiments on a Jetson TX2 testbed and a deep learning workstation, as well as extensive simulations. Both the experimental and simulation results on classical models and datasets show the effectiveness of the proposed mechanism and algorithms. For example, compared with state-of-the-art solutions, CE-AFL can reduce training time by about 69% while achieving similar accuracy, and improve the accuracy of the trained models by about 18% under resource constraints.
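The core idea in the abstract can be illustrated with a minimal sketch: in each epoch the server waits only for the first ceil(alpha * N) local updates to arrive and averages those, rather than blocking on all N nodes. This is an assumption-laden toy (scalar "updates", simple averaging, a hypothetical `ce_afl_epoch` helper), not the paper's actual algorithm:

```python
import math

def ce_afl_epoch(local_updates, arrival_order, alpha):
    """Illustrative CE-AFL-style aggregation step (a sketch, not the paper's code).

    local_updates: dict mapping node_id -> local model update
                   (a scalar here for simplicity; real updates are tensors)
    arrival_order: list of node_ids sorted by when their updates arrived
    alpha:         fraction of nodes to wait for, 0 < alpha < 1
    Returns the aggregated update and the ids whose updates were used.
    """
    # Server waits for only the first ceil(alpha * N) arrivals this epoch.
    n_wait = max(1, math.ceil(alpha * len(arrival_order)))
    chosen = arrival_order[:n_wait]
    # Simple average of the selected updates (illustrative aggregation rule).
    aggregated = sum(local_updates[i] for i in chosen) / n_wait
    return aggregated, chosen

# Example: 5 nodes, alpha = 0.4, so the server waits for the 2 fastest nodes.
updates = {"a": 1.0, "b": 2.0, "c": 3.0, "d": 4.0, "e": 5.0}
order = ["c", "a", "e", "b", "d"]  # "c" and "a" arrive first
agg, used = ce_afl_epoch(updates, order, alpha=0.4)
```

With these inputs the server aggregates only nodes "c" and "a", so slow stragglers ("b", "d") do not delay the epoch; choosing alpha then trades staleness against per-epoch waiting time, which is the optimization the paper studies.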
Year
2021
DOI
10.1016/j.comnet.2021.108429
Venue
COMPUTER NETWORKS
Keywords
Federated learning, Edge computing, Communication-efficient, Asynchronous
DocType
Journal
Volume
199
ISSN
1389-1286
Citations
0
PageRank
0.34
References
0
Authors
7
Name            Order  Citations  PageRank
Jianchun Liu    1      8          2.83
Hongli Xu       2      502        85.92
Yang Xu         3      47         6.27
Zhenguo Ma      4      0          0.68
Zhiyuan Wang    5      3          1.41
Qian Chen       6      630        58.09
He Huang        7      829        65.14