Title
Fast-Convergent Federated Learning with Adaptive Weighting
Abstract
Federated learning (FL) enables resource-constrained edge nodes to collaboratively learn a global model under the orchestration of a central server while keeping privacy-sensitive data locally. Non-independent and identically distributed (non-IID) data samples across participating nodes slow model training and impose additional communication rounds for FL to converge. In this paper, we propose the Federated Adaptive Weighting (FedAdp) algorithm, which aims to accelerate model convergence in the presence of nodes with non-IID datasets. Through mathematical and empirical analysis, we observe an implicit connection between the gradient of local training and the data distribution on each local node. We then propose to adaptively assign a different weight to each node for updating the global model in each training round, based on the node's contribution, which is measured by the angle between the local gradient vector and the global gradient vector and quantified by a designed non-linear mapping function. This simple yet effective strategy dynamically reinforces positive (and suppresses negative) node contributions, which drastically reduces the number of communication rounds. With extensive experiments performed in PyTorch and PySyft, we show that FL training with FedAdp can reduce the number of communication rounds by up to 54.1% on the MNIST dataset and up to 45.4% on the FashionMNIST dataset, compared with the commonly adopted Federated Averaging (FedAvg) algorithm.
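The abstract describes weighting each node's update by the angle between its local gradient and the global gradient, passed through a non-linear mapping. Below is a minimal illustrative sketch of that idea, not the authors' implementation: the function name adaptive_weights, the exponential mapping, and the decay parameter alpha are assumptions for illustration (the paper's actual mapping function is not specified in the abstract).

```python
# Minimal sketch (not the paper's code) of angle-based adaptive weighting.
# The exponential mapping below is a hypothetical stand-in for the paper's
# designed non-linear mapping function.
import numpy as np

def adaptive_weights(local_grads, global_grad, alpha=5.0):
    """Weight each node by the angle between its local gradient vector and
    the global gradient vector, mapped through a non-linear function."""
    scores = []
    for g in local_grads:
        cos_sim = np.dot(g, global_grad) / (np.linalg.norm(g) * np.linalg.norm(global_grad))
        theta = np.arccos(np.clip(cos_sim, -1.0, 1.0))  # angle in [0, pi]
        # Hypothetical mapping: smaller angle (more aligned) -> larger contribution
        scores.append(np.exp(-alpha * theta))
    scores = np.array(scores)
    return scores / scores.sum()  # normalize so weights sum to 1

# Usage example: aggregate local updates as a weighted average
local_grads = [np.random.randn(10) for _ in range(3)]
global_grad = np.mean(local_grads, axis=0)
w = adaptive_weights(local_grads, global_grad)
aggregated_update = sum(wi * gi for wi, gi in zip(w, local_grads))
```

In this sketch, nodes whose local gradients point in roughly the same direction as the global gradient receive larger aggregation weights, while nodes whose gradients diverge (e.g., due to non-IID data) are suppressed.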
Year
2021
DOI
10.1109/ICC42927.2021.9500890
Venue
IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2021)
DocType
Conference
ISSN
1550-3607
Citations
0
PageRank
0.34
References
0
Authors
2
Name | Order | Citations | PageRank
Hongda Wu | 1 | 0 | 0.34
Ping Wang | 2 | 4153 | 216.93