Title
FedMP: Federated Learning through Adaptive Model Pruning in Heterogeneous Edge Computing
Abstract
Federated learning (FL) has been widely adopted to train machine learning models over massive distributed data sources in edge computing. However, existing FL frameworks usually suffer from resource limitations and edge heterogeneity. Herein, we design and implement FedMP, an efficient FL framework based on adaptive model pruning. We theoretically analyze the impact of the pruning ratio on model training performance, and propose a Multi-Armed Bandit based online learning algorithm that adaptively determines different pruning ratios for heterogeneous edge nodes, even without any prior knowledge of their computation and communication capabilities. With adaptive model pruning, FedMP not only reduces resource consumption but also achieves promising accuracy. To prevent the diverse structures of the pruned models from affecting training convergence, we further present a new parameter synchronization scheme, called Residual Recovery Synchronous Parallel (R2SP), and provide a theoretical convergence guarantee. Extensive experiments on classical models and datasets demonstrate that FedMP is effective under different heterogeneous scenarios and data distributions, and can provide up to 4.1× speedup over existing FL methods.
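For illustration only, the sketch below shows one way a Multi-Armed Bandit could select a per-node pruning ratio online, as the abstract describes. It is a generic UCB1 sketch under assumptions, not the paper's exact algorithm: the candidate ratio set, the reward definition (accuracy gain per unit round time), and the simulate_round / node_speed placeholders are all hypothetical.

```python
# Illustrative sketch (not the authors' exact algorithm): a UCB1-style
# multi-armed bandit that picks a pruning ratio for one edge node.
# The arm set, reward definition, and node attributes are assumptions.
import math
import random


class PruningRatioBandit:
    """Chooses a pruning ratio from a discrete candidate set via UCB1."""

    def __init__(self, candidate_ratios=(0.0, 0.2, 0.4, 0.6, 0.8)):
        self.ratios = list(candidate_ratios)
        self.counts = [0] * len(self.ratios)    # times each ratio was tried
        self.values = [0.0] * len(self.ratios)  # running mean reward per ratio

    def select(self, round_idx):
        # Try every arm once before applying the UCB rule.
        for i, c in enumerate(self.counts):
            if c == 0:
                return i
        ucb = [
            self.values[i] + math.sqrt(2 * math.log(round_idx + 1) / self.counts[i])
            for i in range(len(self.ratios))
        ]
        return max(range(len(self.ratios)), key=lambda i: ucb[i])

    def update(self, arm, reward):
        # Incremental mean update of the observed reward for this arm.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


def simulate_round(ratio, node_speed):
    """Toy stand-in for one local round: returns (accuracy_gain, round_time)."""
    acc_gain = max(0.0, 1.0 - ratio) * random.uniform(0.8, 1.2)  # heavier pruning costs accuracy
    round_time = (1.0 - ratio) / node_speed                      # heavier pruning shortens the round
    return acc_gain, round_time


if __name__ == "__main__":
    bandit = PruningRatioBandit()
    node_speed = 0.5  # hypothetical relative compute/communication capability
    for t in range(200):
        arm = bandit.select(t)
        acc_gain, round_time = simulate_round(bandit.ratios[arm], node_speed)
        # One possible reward: accuracy gain traded against wall-clock cost.
        bandit.update(arm, acc_gain / (round_time + 1e-6))
    best = max(range(len(bandit.ratios)), key=lambda i: bandit.values[i])
    print("preferred pruning ratio for this node:", bandit.ratios[best])
```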
Year: 2022
DOI: 10.1109/ICDE53745.2022.00062
Venue: 2022 IEEE 38th International Conference on Data Engineering (ICDE)
Keywords: Internet of Things, Federated Learning, Adaptive Model Pruning, Heterogeneity
DocType: Conference
ISSN: 1063-6382
ISBN: 978-1-6654-0884-4
Citations: 0
PageRank: 0.34
References: 8
Authors: 6
Author details (Name, Order, Citations/PageRank):
Z. B. Jiang, 1, 24236.08
Yongjun Xu, 2, 15835.23
Hongbing Xu, 3, 19516.46
Zhiqiang Wang, 4, 15835.98
Chunming Qiao, 5, 3971400.49
Yuhai Zhao, 6, 10919.49