Title
Conditional Accelerated Lazy Stochastic Gradient Descent
Abstract
In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm that achieves an optimal number of calls to a stochastic first-order oracle and a convergence rate of $O\left(\frac{1}{\varepsilon^2}\right)$, improving over the projection-free, Online Frank-Wolfe based stochastic gradient descent of Hazan and Kale [2012], which has a convergence rate of $O\left(\frac{1}{\varepsilon^4}\right)$.
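To make the gap between the two rates concrete (an illustrative calculation, not taken from the paper): at a target accuracy of $\varepsilon = 10^{-2}$, the oracle-call bounds differ by four orders of magnitude:

$$\frac{1}{\varepsilon^{2}} = \frac{1}{(10^{-2})^{2}} = 10^{4} \qquad \text{vs.} \qquad \frac{1}{\varepsilon^{4}} = \frac{1}{(10^{-2})^{4}} = 10^{8}.$$

"Projection-free" in the abstract refers to conditional gradient (Frank-Wolfe) methods, which replace potentially expensive projections with a linear minimization oracle (LMO). Below is a minimal sketch of a generic stochastic Frank-Wolfe iteration on an assumed least-squares problem over the probability simplex; it illustrates only the projection-free idea, not the paper's CALSGD algorithm, and the problem data and names are hypothetical.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): minimize ||Ax - b||^2
# over the probability simplex, using one sampled row per iteration.
rng = np.random.default_rng(0)
d = 20
A = rng.normal(size=(200, d))
x_true = np.abs(rng.normal(size=d))
x_true /= x_true.sum()
b = A @ x_true

x = np.full(d, 1.0 / d)  # start at the barycenter of the simplex
for t in range(500):
    i = rng.integers(0, A.shape[0])         # one stochastic first-order oracle call:
    g = 2.0 * A[i] * (A[i] @ x - b[i])      # gradient of (a_i^T x - b_i)^2
    v = np.zeros(d)
    v[np.argmin(g)] = 1.0                   # LMO over the simplex returns a vertex
    gamma = 2.0 / (t + 2.0)                 # classic Frank-Wolfe step size
    x = (1.0 - gamma) * x + gamma * v       # convex combination stays feasible: no projection
print("final residual:", np.linalg.norm(A @ x - b))
```

Because each iterate is a convex combination of simplex vertices, the method never leaves the feasible set, which is why no projection step is required.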
Year
2017
Venue
ICML
DocType
Conference
Volume
abs/1703.05840
Citations
2
PageRank
0.41
References
11
Authors
4
Name                Order   Citations   PageRank
Guanghui Lan        1       12126       6.26
Sebastian Pokutta   2       2673        2.02
Yi Zhou             3       651         7.55
Daniel Zink         4       7           1.22