Title
OPTIMAL IMPORTANCE SAMPLING FOR FEDERATED LEARNING
Abstract
Federated learning involves a mixture of centralized and decentralized processing tasks, where a server regularly selects a sample of the agents and these in turn sample their local data to compute stochastic gradients for their learning updates. The sampling of both agents and data is generally uniform; however, in this work we consider non-uniform sampling. We derive optimal importance sampling strategies for both agent and data selection and show that under convexity and Lipschitz assumptions, non-uniform sampling without replacement improves the performance of the original FedAvg algorithm. We run experiments on a regression and classification problem to illustrate the theoretical results.
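The abstract describes a FedAvg-style loop in which the server samples a subset of agents per round, each computing a gradient on its local data. A minimal sketch of non-uniform agent sampling with inverse-probability reweighting is shown below; the probabilities here are proportional to local dataset size purely as an illustrative stand-in for the paper's optimal importance-sampling distribution, and all problem dimensions, step sizes, and names are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K agents, each holding a local linear-regression dataset.
K, d = 10, 5
w_star = rng.normal(size=d)
data = []
for k in range(K):
    n_k = int(rng.integers(20, 200))      # heterogeneous dataset sizes (assumed)
    X = rng.normal(size=(n_k, d))
    y = X @ w_star + 0.1 * rng.normal(size=n_k)
    data.append((X, y))

def local_grad(w, X, y):
    """Full local gradient of the least-squares loss (1/2n)||Xw - y||^2."""
    return X.T @ (X @ w - y) / len(y)

# Non-uniform sampling distribution over agents. Proportional-to-size
# probabilities are a placeholder for the paper's optimal choice.
sizes = np.array([len(y) for _, y in data])
p = sizes / sizes.sum()

w = np.zeros(d)
mu = 0.05        # step size (assumed)
L_sel = 4        # agents sampled per round (assumed)
for it in range(500):
    # Sampling with replacement here, so that dividing each local gradient
    # by K * p[k] makes the averaged update an unbiased estimate of the
    # full gradient (1/K) * sum_k grad_k.
    chosen = rng.choice(K, size=L_sel, replace=True, p=p)
    g = sum(local_grad(w, *data[k]) / (K * p[k]) for k in chosen) / L_sel
    w -= mu * g

print(np.linalg.norm(w - w_star))  # should be small after convergence
```

Note the example uses sampling with replacement to keep the unbiasedness argument elementary; the paper's analysis covers sampling without replacement, where the correction weights take a different form.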
Year: 2021
DOI: 10.1109/ICASSP39728.2021.9413655
Venue: 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021)
Keywords: federated learning, importance sampling, asynchronous SGD, non-IID data, heterogeneous agents
DocType: Conference
Citations: 0
PageRank: 0.34
References: 1
Authors: 3
Name            Order  Citations  PageRank
Elsa Rizk       1      0          0.34
Stefan Vlaski   2      231        1.39
Ali H. Sayed    3      91346      67.71