Title: Asynchronous Decentralized Accelerated Stochastic Gradient Descent
Abstract: In this paper, we introduce an asynchronous decentralized accelerated stochastic gradient descent algorithm for decentralized stochastic optimization. Since communication and synchronization costs are the major bottlenecks in decentralized optimization, we attempt to reduce these costs through algorithmic design; in particular, we are able to reduce the number of agents invol...
Year: 2018
DOI: 10.1109/JSAIT.2021.3080256
Venue: IEEE Journal on Selected Areas in Information Theory
Keywords: Complexity theory, Convergence, Signal processing algorithms, Optimization, Delays, Support vector machines, Convex functions
Field: Discrete mathematics, Asynchronous communication, Stochastic gradient descent, Stochastic optimization, Synchronization, Regular polygon, Communication complexity, Convex function, Sampling (statistics), Mathematics
DocType: Journal
Volume: 2
Issue: 2
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
Name          Order  Citations  PageRank
Guanghui Lan  1      12126      6.26
Yi Zhou       2      88         5.26