Title
SPIDER: Near-Optimal Non-Convex Optimization via Stochastic Path Integrated Differential Estimator.
Abstract
In this paper, we propose a new technique named Stochastic Path-Integrated Differential EstimatoR (SPIDER), which can be used to track many deterministic quantities of interest with significantly reduced computational cost. Combining SPIDER with the method of normalized gradient descent, we propose SPIDER-SFO, which solves non-convex stochastic optimization problems using stochastic gradients only. We provide several error-bound results on its convergence rates. Specifically, we prove that the SPIDER-SFO algorithm achieves a gradient computation cost of O(min(n^(1/2) ε^(-2), ε^(-3))) to find an ε-approximate first-order stationary point. In addition, we prove that SPIDER-SFO nearly matches the algorithmic lower bound for finding stationary points under the gradient-Lipschitz assumption in the finite-sum setting. Our SPIDER technique can be further applied to find an (ε, O(ε^(0.5)))-approximate second-order stationary point at a gradient computation cost of Õ(min(n^(1/2) ε^(-2) + ε^(-2.5), ε^(-3))).
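The abstract describes SPIDER-SFO as a path-integrated gradient estimator combined with normalized gradient steps: a full gradient is computed every q iterations, and in between the estimator is updated with mini-batch gradient differences along the optimization path. Below is a minimal, illustrative NumPy sketch of that scheme on a toy least-squares finite-sum problem; the problem instance and the hyperparameters (q, batch size, step size eta) are assumptions chosen for the example, not the paper's theoretical settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum objective: f(x) = (1/n) * sum_i 0.5 * (a_i . x - b_i)^2
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def grad(x, idx):
    """Mini-batch gradient of the least-squares loss over rows idx."""
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

def spider_sfo(x0, epochs=30, q=20, batch=20, eta=0.05):
    """Sketch of SPIDER-SFO: path-integrated estimator + normalized steps.

    Every q iterations the estimator v is refreshed with a full gradient;
    otherwise it is updated with a mini-batch gradient difference between
    the current and previous iterates (the "path-integrated" update).
    """
    x, x_prev, v = x0.copy(), None, None
    for k in range(epochs * q):
        if k % q == 0:
            v = grad(x, np.arange(n))                   # full-gradient refresh
        else:
            idx = rng.choice(n, size=batch, replace=False)
            v = grad(x, idx) - grad(x_prev, idx) + v    # path-integrated update
        x_prev = x
        x = x - eta * v / max(np.linalg.norm(v), 1e-12) # normalized gradient step
    return x
```

With constant step size eta, the normalized update drives the iterate into a neighborhood of a stationary point whose radius scales with eta, which mirrors the role of the ε-dependent step size in the paper's analysis.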
Year
2018
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018)
Volume
31
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
4
Name             Order  Citations  PageRank
Cong Fang        1      17         7.14
Chris Junchi Li  2      5          1.11
Zhouchen Lin     3      4805       203.69
Zhang, Tong      4      7126       611.43