Title
Towards Mitigating Device Heterogeneity in Federated Learning via Adaptive Model Quantization
Abstract
Federated learning (FL) is increasingly becoming the norm for training models over distributed and private datasets. Major service providers rely on FL to improve services such as text auto-completion, virtual keyboards, and item recommendations. Nonetheless, training models with FL in practice requires a significant amount of time (days or even weeks) because FL tasks execute in highly heterogeneous environments where devices have widely varying yet limited computing capabilities and network connectivity conditions. In this paper, we focus on mitigating the extent of device heterogeneity, which is a main contributing factor to training time in FL. We propose AQFL, a simple and practical approach that leverages adaptive model quantization to homogenize the computing resources of the clients. We evaluate AQFL on five common FL benchmarks. The results show that, in heterogeneous settings, AQFL obtains nearly the same quality and fairness as the model trained in homogeneous settings.
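This record does not describe AQFL's mechanism beyond the summary above. The sketch below illustrates the general idea of adaptive model quantization in FL: assign each client a quantization bit-width matched to its relative compute speed, so slower devices train over more coarsely quantized weights. This is a minimal, hypothetical illustration, not the paper's actual design; the policy select_bitwidth, the parameter reference_speed, and the uniform quantizer are all assumptions introduced here.

import numpy as np

def select_bitwidth(client_speed, min_bits=2, max_bits=8, reference_speed=1.0):
    # Hypothetical policy (not from the paper): slower clients get fewer bits,
    # capped at max_bits once they reach the reference speed.
    ratio = min(client_speed / reference_speed, 1.0)
    return int(round(min_bits + ratio * (max_bits - min_bits)))

def quantize_weights(weights, bits):
    # Uniform symmetric quantization onto a grid of 2**bits - 1 levels
    # spanning [-w_max, w_max].
    levels = 2 ** bits - 1
    w_max = float(np.abs(weights).max()) or 1.0
    step = 2 * w_max / levels
    return np.round(weights / step) * step

# Example: a slow client (0.3x reference speed) receives a coarser model
# than a fast client (1.0x reference speed).
rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)
for speed in (0.3, 1.0):
    bits = select_bitwidth(speed)
    wq = quantize_weights(w, bits)
    print(f"speed={speed}: {bits}-bit, MSE={np.mean((w - wq) ** 2):.5f}")

The capability-to-bit-width mapping above is linear for simplicity; any monotone policy that shrinks per-client compute cost to roughly equalize round times would serve the same homogenizing purpose the abstract describes.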
Year: 2021
DOI: 10.1145/3437984.3458839
Venue: EUROSYS
DocType: Conference
Citations: 2
PageRank: 0.38
References: 0
Authors: 2
Name                    Order  Citations  PageRank
Ahmed M. Abdelmoniem    1      5          2.13
Marco Canini            2      8576       0.21