Abstract
---
In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and convergence rate $O\left(\frac{1}{\varepsilon^2}\right)$, improving over the projection-free, Online Frank-Wolfe-based stochastic gradient descent of Hazan and Kale [2012], which converges at rate $O\left(\frac{1}{\varepsilon^4}\right)$.
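To see what the gap between the two rates means in practice, the sketch below tabulates the oracle-call counts implied by $O(1/\varepsilon^2)$ versus $O(1/\varepsilon^4)$ for a few target accuracies. The constants are set to 1 purely for illustration; the true constants depend on problem parameters not stated in the abstract.

```python
# Illustrative comparison of the two convergence rates quoted in the
# abstract: O(1/eps^2) for the proposed algorithm vs O(1/eps^4) for the
# Online Frank-Wolfe based SGD of Hazan and Kale [2012].
# Leading constants are taken to be 1 for illustration only.

def calls_proposed(eps: float) -> float:
    """Oracle calls needed at accuracy eps under the O(1/eps^2) rate."""
    return 1.0 / eps**2

def calls_ofw_sgd(eps: float) -> float:
    """Oracle calls needed at accuracy eps under the O(1/eps^4) rate."""
    return 1.0 / eps**4

for eps in (1e-1, 1e-2, 1e-3):
    print(f"eps={eps:g}: proposed ~ {calls_proposed(eps):.0e}, "
          f"OFW-SGD ~ {calls_ofw_sgd(eps):.0e}")
```

At $\varepsilon = 10^{-3}$ this is the difference between roughly $10^6$ and $10^{12}$ oracle calls, which is why the improved rate matters.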
| Year | Venue | DocType | Volume | Citations | PageRank | References | Authors |
|---|---|---|---|---|---|---|---|
| 2017 | ICML | Conference | abs/1703.05840 | 2 | 0.41 | 11 | 4 |
Authors
---
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Guanghui Lan | 1 | 1212 | 66.26 |
| Sebastian Pokutta | 2 | 267 | 32.02 |
| Yi Zhou | 3 | 65 | 17.55 |
| Daniel Zink | 4 | 7 | 1.22 |