Title
Bandit-based Communication-Efficient Client Selection Strategies for Federated Learning
Abstract
Due to communication constraints and intermittent client availability in federated learning, only a subset of clients can participate in each training round. While most prior works assume uniform, unbiased client selection, recent work on biased client selection has shown that selecting clients with higher local losses can speed up error convergence. However, previously proposed biased selection strategies either incur additional communication cost to evaluate the exact local loss or rely on stale local losses, which can even cause the model to diverge. In this paper, we present UCB-CS, a bandit-based communication-efficient client selection strategy that achieves faster convergence with lower communication overhead. We also demonstrate how client selection can be used to improve fairness.
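The abstract describes selecting clients via an upper-confidence-bound (UCB) rule on their local losses. The paper's exact UCB-CS algorithm is not given here, so the following is only a minimal illustrative sketch of the general idea, assuming a standard UCB index (estimated loss plus an exploration bonus for rarely-selected clients); the function names and the constant `c` are hypothetical.

```python
import math


def ucb_scores(loss_estimates, pull_counts, t, c=2.0):
    """Illustrative UCB index per client: the current estimate of its local
    loss plus an exploration bonus that is large for rarely-selected clients.
    Clients never selected get an infinite score so each is tried at least once."""
    scores = []
    for mu, n in zip(loss_estimates, pull_counts):
        if n == 0:
            scores.append(float("inf"))
        else:
            scores.append(mu + math.sqrt(c * math.log(t) / n))
    return scores


def select_clients(loss_estimates, pull_counts, t, m):
    """Pick the m clients with the largest UCB indices for this round."""
    scores = ucb_scores(loss_estimates, pull_counts, t)
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:m]
```

With equal selection counts, the client with the highest estimated loss is chosen, matching the biased-selection intuition; the bonus term keeps under-sampled clients from being starved without requiring fresh loss evaluations from every client each round.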
Year: 2020
DOI: 10.1109/IEEECONF51394.2020.9443523
Venue: ACSSC
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name           Order  Citations  PageRank
Yae Jee Cho    1      0          0.34
Samarth Gupta  2      13         5.60
Gauri Joshi    3      308        29.70
Osman Yagan    4      430        43.65