Title
Towards More Efficient Stochastic Decentralized Learning: Faster Convergence and Sparse Communication.
Abstract
The decentralized optimization problem has recently attracted growing attention. Most existing methods are deterministic, incur a high per-iteration cost, and have a convergence rate that depends quadratically on the problem condition number. Moreover, dense communication is required to ensure convergence even when the dataset is sparse. In this paper, we generalize the decentralized optimization problem to a monotone operator root-finding problem and propose a stochastic algorithm named DSBA that (i) converges geometrically, with a rate that depends only linearly on the problem condition number, and (ii) can be implemented using sparse communication alone. Additionally, DSBA handles learning problems such as AUC maximization, which cannot be tackled efficiently in the decentralized setting. Experiments on convex minimization and AUC maximization validate the efficiency of our method.
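For context, a minimal sketch of the decentralized optimization problem the abstract refers to, in generic notation that is assumed here rather than taken from the paper: n nodes, where node i holds a local convex loss f_i, cooperate over a network to agree on a common minimizer, and the operator view recasts optimality as finding a root of a monotone operator T.

% Generic decentralized consensus formulation (assumed notation, not the paper's own statement):
% n nodes; node i holds a local convex loss f_i; all local copies x_i must agree.
\begin{align*}
  \min_{x \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} f_i(x)
  \quad\Longleftrightarrow\quad
  \min_{x_1, \dots, x_n \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} f_i(x_i)
  \quad \text{s.t.} \quad x_1 = x_2 = \dots = x_n .
\end{align*}
% Optimality can then be posed as a root-finding problem for a monotone operator T
% built from the local (sub)gradients and a consensus-enforcing term:
% find x^* such that 0 \in T(x^*).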
Year
2018
Venue
ICML
DocType
Conference
Volume
abs/1805.09969
Citations
2
PageRank
0.36
References
10
Authors
5
Name            Order  Citations  PageRank
Zebang Shen     1      17         9.36
Aryan Mokhtari  2      211        24.93
Tengfei Zhou    3      22         5.08
Peilin Zhao     4      1365       80.09
Hui Qian        5      59         13.26