Title
Variance-Reduced Proximal Stochastic Gradient Descent for Non-Convex Composite Optimization
Abstract
We study non-convex composite optimization, in which the objective is the sum of two parts: a finite sum of smooth but possibly non-convex functions, and a general function that admits a simple proximal mapping. Most existing work on stochastic methods for composite optimization assumes that each function is convex or strongly convex. In this paper, we extend the problem to the non-convex setting using variance-reduction techniques, namely prox-SVRG and prox-SAGA. We prove that, with a constant step size, both prox-SVRG and prox-SAGA are suitable for non-convex composite optimization and converge to a stationary point within $O(1/\epsilon)$ iterations, matching the convergence rate of the state-of-the-art RSAG method and improving on stochastic gradient descent. We also extend the analysis to the mini-batch setting, which yields a linear speedup in convergence. To the best of our knowledge, this is the first analysis of the convergence rate of variance-reduced proximal stochastic gradient methods for non-convex composite optimization.
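To make the update rule concrete, below is a minimal sketch of a non-convex prox-SVRG loop of the kind the abstract describes, taking an l1 term lam * ||x||_1 as the "simple" proximal part. The names prox_svrg, grad_fi, and soft_threshold and all hyperparameters are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_svrg(grad_fi, n, x0, step, lam, epochs=20, inner_steps=None, rng=None):
    """Sketch of non-convex prox-SVRG for
        min_x (1/n) * sum_i f_i(x) + lam * ||x||_1,
    where each f_i is smooth but possibly non-convex.
    grad_fi(i, x) returns the gradient of the i-th smooth component."""
    rng = np.random.default_rng() if rng is None else rng
    m = n if inner_steps is None else inner_steps
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot: the anchor used for variance reduction.
        full_grad = np.mean([grad_fi(i, snapshot) for i in range(n)], axis=0)
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimate of grad f(x); its variance shrinks
            # as the iterate approaches the snapshot.
            v = grad_fi(i, x) - grad_fi(i, snapshot) + full_grad
            # Proximal step with a constant step size.
            x = soft_threshold(x - step * v, step * lam)
    return x
```

The same loop gives the mini-batch variant mentioned in the abstract if v is averaged over a sampled batch of indices rather than a single i; prox-SAGA differs mainly in maintaining a table of per-component gradients instead of a periodic full-gradient snapshot.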
Year
2016
Venue
arXiv: Machine Learning
Field
Convergence (routing), Gradient method, Stochastic gradient descent, Mathematical optimization, Convexity, Proximal Gradient Methods, Stationary point, Rate of convergence, Variance reduction, Mathematics
DocType
Journal
Volume
abs/1606.00602
Citations
0
PageRank
0.34
References
8
Authors
2
Name         Order  Citations  PageRank
Xiyu Yu      1      32         2.28
Dacheng Tao  2      19032      747.78