Title |
---|
SARAH: A Novel Method for Machine Learning Problems Using Stochastic Recursive Gradient |
Abstract |
---|
In this paper, we propose the StochAstic Recursive grAdient algoritHm (SARAH), as well as its practical variant SARAH+, as a novel approach to finite-sum minimization problems. Unlike vanilla SGD and other modern stochastic methods such as SVRG, S2GD, SAG, and SAGA, SARAH admits a simple recursive framework for updating stochastic gradient estimates; moreover, unlike SAG/SAGA, SARAH does not require storage of past gradients. The linear convergence rate of SARAH is proven under a strong convexity assumption. We also prove a linear convergence rate (in the strongly convex case) for an inner loop of SARAH, a property that SVRG does not possess. Numerical experiments demonstrate the efficiency of our algorithm. |
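The recursive update the abstract describes can be sketched briefly: the outer loop computes a full gradient, and each inner step updates the estimate as v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1}, so no table of past gradients is stored. Below is a minimal illustrative sketch on a least-squares objective f(w) = (1/2n)·||Aw − b||²; the problem setup, step size, and loop counts are assumptions for demonstration, not values from the paper.

```python
import numpy as np

def sarah(A, b, eta=0.05, outer=20, inner=100, seed=0):
    """Illustrative sketch of SARAH's recursive gradient update
    on least squares (toy setup; hyperparameters are assumptions)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(outer):
        # v_0: full gradient at the current outer iterate
        v = A.T @ (A @ w - b) / n
        w_prev, w = w, w - eta * v
        for _ in range(inner):
            i = rng.integers(n)
            ai = A[i]
            # recursive estimate: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
            v = ai * (ai @ w - b[i]) - ai * (ai @ w_prev - b[i]) + v
            w_prev, w = w, w - eta * v
    return w
```

Note that, unlike SVRG, the inner step uses the gradient difference at consecutive iterates (w_t, w_{t−1}) plus the previous estimate v_{t−1}, rather than re-centering on the outer snapshot each time.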
Year | Venue | DocType |
---|---|---|
2017 | ICML | Conference |
arXiv | Publication | Citations |
---|---|---|
abs/1703.00102 | Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2613-2621, 2017 | 20 |
PageRank | References | Authors |
---|---|---|
0.64 | 12 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Lam M. Nguyen | 1 | 43 | 8.95 |
Jie Liu | 2 | 61 | 3.25 |
Katya Scheinberg | 3 | 744 | 69.50 |
Martin Takáč | 4 | 752 | 49.49 |