Title
Accelerating Asynchronous Algorithms for Convex Optimization by Momentum Compensation.
Abstract
Asynchronous algorithms have attracted much attention recently due to the crucial demand for solving large-scale optimization problems. However, accelerated versions of asynchronous algorithms are rarely studied. In this paper, we propose the momentum compensation technique to accelerate asynchronous algorithms for convex problems. Specifically, we first accelerate the plain Asynchronous Gradient Descent, achieving a faster $O(1/\sqrt{\epsilon})$ (vs. $O(1/\epsilon)$) convergence rate for non-strongly convex functions, and $O(\sqrt{\kappa}\log(1/\epsilon))$ (vs. $O(\kappa\log(1/\epsilon))$) for strongly convex functions with condition number $\kappa$, to reach an $\epsilon$-approximate minimizer. We further apply the technique to accelerate modern stochastic asynchronous algorithms such as Asynchronous Stochastic Coordinate Descent and Asynchronous Stochastic Gradient Descent. Both of the resulting practical algorithms are faster than existing ones in convergence-rate order. To the best of our knowledge, we are the first to consider accelerated algorithms that allow updating with delayed gradients and the first to propose truly accelerated asynchronous algorithms. Finally, experimental results on a shared-memory system show that acceleration can lead to significant performance gains on ill-conditioned problems.
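To make the delayed-gradient setting described above concrete, here is a minimal, hypothetical Python sketch: it simulates asynchrony by feeding gradients that are a few steps stale into (i) plain gradient descent and (ii) a Nesterov-style momentum variant, on a toy strongly convex quadratic. The quadratic, the staleness tau, the step size, and the constant momentum beta are illustrative assumptions; the sketch does not reproduce the paper's momentum compensation correction, only the vanilla delayed-gradient and momentum updates it builds on.

```python
import numpy as np

# Toy strongly convex quadratic f(x) = 0.5 x^T A x - b^T x (hypothetical example).
rng = np.random.default_rng(0)
d = 50
A = rng.standard_normal((d, d))
A = A @ A.T / d + 0.1 * np.eye(d)          # positive definite Hessian
b = rng.standard_normal(d)
x_star = np.linalg.solve(A, b)             # exact minimizer, used only to measure error
L = np.linalg.eigvalsh(A).max()            # Lipschitz constant of the gradient

def grad(x):
    return A @ x - b

tau = 1                  # maximum gradient staleness (kept tiny here; larger delays need smaller steps)
eta = 1.0 / (3.0 * L)    # conservative step size chosen to tolerate the delay
beta = 0.7               # illustrative constant momentum, not the paper's compensated momentum
T = 300

# Plain gradient descent driven by tau-step-stale gradients (simulated asynchrony).
x = np.zeros(d)
iterates = [x.copy()]
for k in range(T):
    x = x - eta * grad(iterates[max(0, k - tau)])
    iterates.append(x.copy())

# Nesterov-style momentum variant using the same stale gradients.
x_prev = np.zeros(d)
y = np.zeros(d)
y_hist = [y.copy()]
for k in range(T):
    x_new = y - eta * grad(y_hist[max(0, k - tau)])
    y = x_new + beta * (x_new - x_prev)    # momentum extrapolation step
    x_prev = x_new
    y_hist.append(y.copy())

print("delayed GD error:      ", np.linalg.norm(x - x_star))
print("delayed momentum error:", np.linalg.norm(x_prev - x_star))
```

In this toy setting both loops converge because the step size is deliberately conservative; larger staleness generally requires smaller steps, which is the regime where correcting the momentum for the delay becomes relevant.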
Year
2018
Venue
arXiv: Optimization and Control
Field
Discrete mathematics, Asynchronous communication, Condition number, Stochastic gradient descent, Gradient descent, Convex function, Rate of convergence, Coordinate descent, Convex optimization, Mathematics
DocType
Journal
Volume
abs/1802.09747
Citations
2
PageRank
0.38
References
9
Authors
3
Name | Order | Citations | PageRank
Cong Fang | 1 | 17 | 7.14
Yameng Huang | 2 | 2 | 0.38
Zhouchen Lin | 3 | 4805 | 203.69