Title
Federated Multi-Task Learning
Abstract
Federated learning poses new statistical and systems challenges in training machine learning models over distributed networks of devices. In this work, we show that multi-task learning is naturally suited to handle the statistical challenges of this setting, and propose a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues. Our method and theory are the first to consider issues of high communication cost, stragglers, and fault tolerance for distributed multi-task learning. The resulting method achieves significant speedups compared to alternatives in the federated setting, as we demonstrate through simulations on real-world federated datasets.
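The abstract describes using multi-task learning to handle statistical heterogeneity across devices: each device keeps its own model while shared structure ties the models together. The sketch below is a minimal, hypothetical illustration of that idea (per-task linear models coupled toward their mean); it is not the MOCHA algorithm, and all names and hyperparameters are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: each device (task) t fits its own linear model
# W[t] on local data, while a coupling term pulls the models toward their
# mean, capturing shared structure across related tasks. Not MOCHA.

rng = np.random.default_rng(0)

def make_task(n=50, d=5, shift=0.0):
    """Generate synthetic local data for one device; tasks are related
    but not identical (true weights differ by `shift`)."""
    X = rng.normal(size=(n, d))
    w_true = np.ones(d) + shift
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

tasks = [make_task(shift=s) for s in (0.0, 0.2, -0.2)]
d = tasks[0][0].shape[1]
W = [np.zeros(d) for _ in tasks]        # one model per device/task
lam, mu, lr = 0.1, 0.5, 0.01            # ridge, task-coupling, step size

for _ in range(500):
    w_bar = np.mean(W, axis=0)          # shared structure: the mean model
    for t, (X, y) in enumerate(tasks):
        # local gradient step: squared loss + ridge + coupling to w_bar
        grad = X.T @ (X @ W[t] - y) / len(y) + lam * W[t] + mu * (W[t] - w_bar)
        W[t] = W[t] - lr * grad
```

After training, each per-task model stays close to its own task's true weights while the coupling term keeps the models near one another, which is the statistical benefit multi-task learning brings to heterogeneous federated data.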
Year
2017
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017)
DocType
Conference
Volume
30
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
4
Name              Order  Citations  PageRank
Virginia Smith    1      339        20.52
Chiang, Chao-Kai  2      0          0.34
Maziar Sanjabi    3      189        13.81
Talwalkar, Ameet  4      1394       66.51