Title
Stochastic Recursive Gradient Algorithm for Nonconvex Optimization
Abstract
In this paper, we study and analyze the mini-batch version of StochAstic Recursive grAdient algoritHm (SARAH), a method employing the stochastic recursive gradient, for solving empirical loss minimization for the case of nonconvex losses. We provide a sublinear convergence rate (to stationary points) for general nonconvex functions and a linear convergence rate for gradient dominated functions, both of which have some advantages compared to other modern stochastic gradient algorithms for nonconvex losses.
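As a rough illustration of the recursive gradient estimator the abstract refers to, the sketch below implements the mini-batch SARAH recursion v_t = g_B(w_t) - g_B(w_{t-1}) + v_{t-1}, with each outer loop anchored by one full gradient. The function and parameter names (sarah_minibatch, grad_full, grad_batch, the step size, batch size, and loop lengths) are illustrative assumptions, not values taken from the paper, and the toy problem at the end is a convex least-squares instance used only to exercise the update.

```python
import numpy as np

def sarah_minibatch(grad_full, grad_batch, w0, n, eta=0.05,
                    batch_size=16, outer_iters=10, inner_iters=50, seed=0):
    """Sketch of mini-batch SARAH for f(w) = (1/n) * sum_i f_i(w).

    grad_full(w)       : full gradient of f at w
    grad_batch(w, idx) : average gradient of f_i over the sampled indices idx
    """
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(outer_iters):
        # Anchor the recursion with one full gradient per outer loop.
        w_prev = w.copy()
        v = grad_full(w)
        w = w - eta * v
        for _ in range(inner_iters):
            # Recursive estimator: v_t = g_B(w_t) - g_B(w_{t-1}) + v_{t-1}
            idx = rng.choice(n, size=batch_size, replace=False)
            v = grad_batch(w, idx) - grad_batch(w_prev, idx) + v
            w_prev, w = w, w - eta * v
    return w


if __name__ == "__main__":
    # Toy least-squares problem (synthetic, purely illustrative).
    rng = np.random.default_rng(1)
    n, d = 1000, 20
    A = rng.standard_normal((n, d))
    b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)
    grad_full = lambda w: A.T @ (A @ w - b) / n
    grad_batch = lambda w, idx: A[idx].T @ (A[idx] @ w - b[idx]) / len(idx)
    w = sarah_minibatch(grad_full, grad_batch, np.zeros(d), n)
    print("final gradient norm:", np.linalg.norm(grad_full(w)))
```

Note the design difference from SVRG that the recursion makes: the inner loop accumulates mini-batch gradient differences taken at consecutive iterates rather than at a fixed snapshot point, which is the "recursive" aspect the abstract highlights.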
Year
2017
Venue
arXiv: Machine Learning
Field
Sublinear function, Gradient method, Mathematical optimization, Algorithm, Loss minimization, Stationary point, Rate of convergence, Mathematics, Recursion
DocType
Journal
Volume
abs/1705.07261
Citations
6
PageRank
0.46
References
5
Authors
4
Name                Order   Citations + PageRank
Lam M. Nguyen       1       438.95
Jie Liu             2       613.25
Katya Scheinberg    3       74469.50
Martin Takác        4       75249.49