Title
Projected Semi-Stochastic Gradient Descent Method with Mini-Batch Scheme under Weak Strong Convexity Assumption
Abstract
We propose a projected semi-stochastic gradient descent method with mini-batches (PS2GD) that improves both the theoretical complexity and the practical performance of the general stochastic gradient descent method (SGD). We prove linear convergence under a weak strong convexity assumption; that is, no strong convexity is required for minimizing a sum of smooth convex functions subject to a compact polyhedral set, a problem class that remains popular across the machine learning community. PS2GD preserves the low per-iteration cost of SGD while achieving high optimization accuracy via a stochastic gradient variance-reduction technique, and it admits a simple parallel implementation with mini-batches. Moreover, PS2GD is applicable to the dual problem of SVM with hinge loss.
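The abstract names the method's ingredients: SVRG/S2GD-style variance reduction, mini-batch sampling, and projection of each iterate onto a compact polyhedral feasible set. Below is a minimal Python sketch of that structure. The least-squares objective, the box projection (a simple compact polyhedral set), the step size, and all function names are illustrative assumptions, not the paper's exact method or parameters.

```python
# Minimal sketch of a projected semi-stochastic gradient method with
# mini-batches (SVRG/S2GD-style variance reduction + projection).
# Problem, step size, and projection set are illustrative assumptions.
import numpy as np

def project_box(w, lo=-1.0, hi=1.0):
    """Euclidean projection onto a box, a simple compact polyhedral set."""
    return np.clip(w, lo, hi)

def ps2gd_sketch(A, b, step=0.1, epochs=20, inner=None, batch=8, seed=0):
    """Minimize (1/n) * sum_i 0.5*(a_i^T w - b_i)^2 over a box constraint."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    w = np.zeros(d)
    for _ in range(epochs):
        # Outer loop: full gradient at the reference point.
        mu = A.T @ (A @ w - b) / n
        w_ref = w.copy()
        for _ in range(inner):
            # Mini-batch variance-reduced gradient estimate.
            S = rng.choice(n, size=batch, replace=False)
            g = A[S].T @ (A[S] @ w - b[S]) / batch
            g_ref = A[S].T @ (A[S] @ w_ref - b[S]) / batch
            v = g - g_ref + mu
            # Gradient step followed by projection onto the feasible set.
            w = project_box(w - step * v)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    b = A @ rng.standard_normal(10)
    w = ps2gd_sketch(A, b)
    print("objective:", 0.5 * np.mean((A @ w - b) ** 2))
```

Each mini-batch gradient over S costs only O(batch * d), while the control variate g_ref - mu keeps the estimator's variance shrinking as w approaches w_ref, which is what allows a constant step size and linear convergence in the variance-reduced setting.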
Year
2016
Venue
arXiv: Learning
Field
Gradient method, Gradient descent, Stochastic gradient descent, Mathematical optimization, Convexity, Hinge loss, Descent direction, Nonlinear conjugate gradient method, Backpropagation, Mathematics
DocType
Journal
Volume
abs/1612.05356
Citations
0
PageRank
0.34
References
0
Authors
2
Name           Order  Citations  PageRank
Jie Liu        1      6          13.25
Martin Takáč   2      752        49.49