Title
SAFA: A Semi-Asynchronous Protocol for Fast Federated Learning With Low Overhead
Abstract
Federated learning (FL) has attracted increasing attention as a promising approach to driving a vast number of end devices with artificial intelligence. However, it is very challenging to guarantee the efficiency of FL given the unreliable nature of end devices and the non-negligible cost of device-server communication. In this article, we propose SAFA, a semi-asynchronous FL protocol, to address problems in federated learning such as low round efficiency and poor convergence rate under extreme conditions (e.g., clients dropping offline frequently). We introduce novel designs in the steps of model distribution, client selection, and global aggregation to mitigate the impact of stragglers, crashes, and model staleness, thereby boosting efficiency and improving the quality of the global model. We have conducted extensive experiments with typical machine learning tasks. The results demonstrate that the proposed protocol is effective in shortening federated round duration, reducing local resource wastage, and improving the accuracy of the global model at an acceptable communication cost.
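The abstract describes a server that tolerates stragglers, crashes, and stale local models during aggregation. As a rough illustration only (not the authors' actual SAFA algorithm), the toy Python simulation below shows one way a server can merge cached client updates semi-asynchronously while discarding updates that exceed a staleness bound; the parameter names (LAG_TOLERANCE, SELECT_FRACTION), the dropout probability, and the plain averaging rule are assumptions made purely for this sketch.

# Toy illustration of semi-asynchronous aggregation with a staleness bound.
# NOT the SAFA algorithm itself; LAG_TOLERANCE, SELECT_FRACTION and the
# simple averaging rule are hypothetical choices for demonstration.
import random
import numpy as np

DIM = 10                # toy model size
LAG_TOLERANCE = 2       # max rounds a cached update may lag before discard
SELECT_FRACTION = 0.5   # fraction of usable cached updates merged per round

def local_update(cid, global_model):
    # Simulated local training: a small step toward a client-specific target.
    target = np.full(DIM, float(cid))
    return global_model + 0.1 * (target - global_model)

def run_round(global_model, client_ids, cache, rnd):
    # A random subset of clients responds; the rest straggle or crash.
    responsive = [cid for cid in client_ids if random.random() > 0.3]
    for cid in responsive:
        cache[cid] = (local_update(cid, global_model), rnd)

    # Keep only cached updates whose staleness is within the tolerance.
    usable = {cid: upd for cid, (upd, produced) in cache.items()
              if rnd - produced <= LAG_TOLERANCE}
    if not usable:
        return global_model  # nothing fresh enough; keep the old model

    # Merge a selected fraction of the usable updates by simple averaging.
    k = max(1, int(SELECT_FRACTION * len(usable)))
    chosen = random.sample(sorted(usable), k)
    return np.mean([usable[cid] for cid in chosen], axis=0)

if __name__ == "__main__":
    random.seed(0)
    client_ids = list(range(10))
    cache, model = {}, np.zeros(DIM)
    for rnd in range(1, 6):
        model = run_round(model, client_ids, cache, rnd)
        print(f"round {rnd}: model norm = {np.linalg.norm(model):.3f}")

In this sketch, a client that goes offline still contributes its most recent cached update for up to LAG_TOLERANCE rounds, after which the update is treated as too stale and ignored.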
Year
2021
DOI
10.1109/TC.2020.2994391
Venue
IEEE Transactions on Computers
Keywords
Protocols, Training, Machine learning, Data models, Optimization, Convergence, Distributed databases
DocType
Journal
Volume
70
Issue
5
ISSN
0018-9340
Citations
10
PageRank
0.59
References
0
Authors
6
Name                Order   Citations   PageRank
Wentai Wu           1       34          3.77
Ligang He           2       542         56.73
Weiwei Lin          3       147         13.95
Rui Mao             4       368         41.23
Xiaoming Kang       5       28          4.10
Stephen A. Jarvis   6       274         22.39