Title
Non-convex Finite-Sum Optimization Via SCSG Methods
Abstract
We develop a class of algorithms, as variants of the stochastically controlled stochastic gradient (SCSG) methods [21], for the smooth non-convex finite-sum optimization problem. Assuming the smoothness of each component, the complexity of SCSG to reach a stationary point with $\mathbb{E}\|\nabla f(x)\|^{2} \le \epsilon$ is $O\left(\min\{\epsilon^{-5/3}, \epsilon^{-1} n^{2/3}\}\right)$, which strictly outperforms stochastic gradient descent. Moreover, SCSG is never worse than the state-of-the-art methods based on variance reduction, and it significantly outperforms them when the target accuracy is low. A similar acceleration is also achieved when the functions satisfy the Polyak-Łojasiewicz condition. Empirical experiments demonstrate that SCSG outperforms stochastic gradient methods on training multi-layer neural networks in terms of both training and validation loss.
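For intuition about the variance-reduction scheme the abstract refers to, below is a minimal Python sketch of an SCSG-style outer/inner loop under simplifying assumptions: a large-batch snapshot gradient controls the variance of small-batch inner updates, and the inner-loop length is drawn from a geometric distribution. The step size `eta`, batch sizes `B` and `b`, and the toy quadratic objective are illustrative choices for this example, not the paper's settings or the authors' reference implementation.

```python
# Illustrative SCSG-style sketch (not the authors' code): variance-reduced
# stochastic gradient steps with a geometrically distributed inner loop.
import numpy as np

def scsg(grad_i, n, x0, eta=0.05, B=64, b=4, outer_iters=50, rng=None):
    """Minimize (1/n) * sum_i f_i(x) given per-component gradients grad_i(x, i)."""
    rng = np.random.default_rng(rng)
    x = x0.astype(float)
    for _ in range(outer_iters):
        # Snapshot: large-batch gradient at the current iterate.
        I = rng.choice(n, size=min(B, n), replace=False)
        g_snap = np.mean([grad_i(x, i) for i in I], axis=0)
        x_snap = x.copy()
        # Inner-loop length ~ Geometric(b / (B + b)), mean roughly B / b.
        N = rng.geometric(b / (B + b))
        for _ in range(N):
            J = rng.choice(n, size=b, replace=False)
            # Variance-reduced direction: small-batch correction plus snapshot gradient.
            v = (np.mean([grad_i(x, j) for j in J], axis=0)
                 - np.mean([grad_i(x_snap, j) for j in J], axis=0)
                 + g_snap)
            x -= eta * v
    return x

if __name__ == "__main__":
    # Toy finite sum: f_i(x) = 0.5 * ||x - a_i||^2, minimized at the mean of the a_i.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 5))
    grad_i = lambda x, i: x - A[i]
    x_hat = scsg(grad_i, n=len(A), x0=np.zeros(5), rng=1)
    print("distance to optimum:", np.linalg.norm(x_hat - A.mean(axis=0)))
```

In this sketch the small batch size b trades off per-step cost against the length of each inner epoch, which is the lever behind the complexity interpolation between the $\epsilon^{-5/3}$ and $\epsilon^{-1} n^{2/3}$ regimes described in the abstract.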
Year
2017
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017)
DocType
Conference
Volume
30
ISSN
1049-5258
Citations
14
PageRank
0.53
References
15
Authors
4
Name               Order  Citations  PageRank
Lihua Lei          1      24         5.52
Ju, Cheng          2      15         1.22
Jianbo Chen        3      82         4.35
Michael I. Jordan  4      312203     640.80