Title
Decentralized Asynchronous Stochastic Gradient Descent: Convergence Rate Analysis
Abstract
Decentralized algorithms for multi-agent networks have attracted considerable research interest. Stochastic gradient descent and its variants are widely used to develop such algorithms. This paper considers a stochastic gradient descent algorithm in which a randomly selected node carries out each update. The stringent computational and communication requirements of the synchronous framework are overcome by an asynchronous variant that allows updates to be carried out using delayed gradients. The performance of the proposed algorithm is analyzed by developing non-asymptotic bounds on the optimality gap as a function of the number of iterations, for various diminishing step-size rules. The bounds indicate that the effect of asynchrony on the convergence rate is minimal. The theoretical findings are further illustrated by solving a distributed estimation problem over a large network. We conclude with a performance comparison of the proposed algorithm against the classical cyclic incremental algorithm.
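As a concrete illustration of the scheme the abstract describes, below is a minimal NumPy sketch of asynchronous SGD in which a randomly selected node applies a stochastic gradient evaluated at a stale (delayed) iterate, with a diminishing step size. The quadratic local objectives f_i(x) = 0.5*||A_i x - b_i||^2, the bounded-delay model, the noise level, and the a0/sqrt(k) step-size rule are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's method): asynchronous SGD
# with random node selection, bounded gradient staleness, and step a0/sqrt(k).

rng = np.random.default_rng(0)
n_nodes, dim, max_delay, n_iters = 10, 5, 3, 5000

# Each node i holds a local least-squares term f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = rng.standard_normal((n_nodes, dim, dim))
x_star = rng.standard_normal(dim)           # target minimizer (for checking)
b = np.einsum("nij,j->ni", A, x_star)

x = np.zeros(dim)
history = [x.copy()]                        # past iterates, read by stale nodes

for k in range(1, n_iters + 1):
    i = rng.integers(n_nodes)               # randomly selected node
    # Staleness in {0, ..., max_delay - 1}, capped by the available history.
    delay = rng.integers(0, min(max_delay, len(history)))
    x_stale = history[-1 - delay]           # delayed iterate seen by node i
    # Noisy gradient of node i's local term, evaluated at the stale point.
    grad = A[i].T @ (A[i] @ x_stale - b[i]) + 0.01 * rng.standard_normal(dim)
    step = 0.02 / np.sqrt(k)                # one diminishing step-size rule
    x = x - step * grad
    history.append(x.copy())

print("final distance to minimizer:", np.linalg.norm(x - x_star))
```

Running the sketch, the final distance to the minimizer is small despite the stale gradients, consistent with the abstract's claim that bounded asynchrony perturbs, rather than derails, the diminishing-step updates.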
Year
2018
DOI
10.1109/SPCOM.2018.8724408
Venue
2018 International Conference on Signal Processing and Communications (SPCOM)
Keywords
Convergence, Delays, Message passing, Resource management, Optimization, Linear programming
Field
Asynchronous communication, Stochastic gradient descent, Asynchrony, Mathematical optimization, Pattern recognition, Computer science, Artificial intelligence, Rate of convergence, Algorithm convergence
DocType
Conference
ISSN
2474-9168
ISBN
978-1-5386-3821-7
Citations
0
PageRank
0.34
References
0
Authors
3
Name                 Order   Citations   PageRank
Amrit Singh Bedi     1       16          9.43
Hrusikesha Pradhan   2       14          1.23
Ketan Rajawat        3       124         25.44