Title
A Self-Correcting Variable-Metric Algorithm for Stochastic Optimization
Abstract
An algorithm for stochastic (convex or nonconvex) optimization is presented. The algorithm is variable-metric in the sense that, in each iteration, the step is computed as the product of a symmetric positive definite scaling matrix and a stochastic (mini-batch) gradient of the objective function, where the sequence of scaling matrices is updated dynamically by the algorithm. A key feature of the algorithm is that it does not overly restrict the manner in which the scaling matrices are updated. Rather, the algorithm exploits fundamental self-correcting properties of BFGS-type updating, properties that have been overlooked in other attempts to devise quasi-Newton methods for stochastic optimization. Numerical experiments illustrate that the method and a limited-memory variant of it are stable and outperform (mini-batch) stochastic gradient and other quasi-Newton methods when employed to solve a few machine learning problems.
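The idea described in the abstract can be illustrated with a minimal sketch: take steps along a stochastic gradient scaled by a symmetric positive definite matrix, and apply a BFGS-type inverse-Hessian update only when the curvature pair passes simple safeguard bounds (a simplified stand-in for the paper's self-correcting conditions). The quadratic objective, step size, noise level, and safeguard constants below are all illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): variable-metric
# stochastic optimization with safeguarded BFGS-type updating.
rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0])              # ill-conditioned quadratic f(x) = 0.5 x^T A x

def noisy_grad(x):
    # Mini-batch gradient surrogate: true gradient plus small noise (assumption).
    return A @ x + 1e-3 * rng.standard_normal(2)

x = np.array([5.0, 5.0])
H = np.eye(2)                          # symmetric positive definite scaling matrix
alpha = 0.1                            # fixed step size (assumption)
beta1, beta2 = 0.5, 20.0               # safeguard bounds on the curvature pair (assumption)

for k in range(200):
    g = noisy_grad(x)
    s = -alpha * (H @ g)               # step = scaling matrix times stochastic gradient
    x = x + s
    y = noisy_grad(x) - g              # gradient displacement along the step
    sy, ss, yy = s @ y, s @ s, y @ y
    # Accept the update only when the pair (s, y) exhibits bounded curvature;
    # noise-dominated pairs are skipped, keeping H positive definite and bounded.
    if sy >= beta1 * ss and yy <= beta2 * sy:
        rho = 1.0 / sy
        I = np.eye(2)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)     # standard BFGS inverse update

print(float(0.5 * x @ A @ x))          # final objective value (near the noise floor)
```

Because each accepted update preserves positive definiteness and the safeguards reject noise-dominated curvature pairs, the scaling matrix adapts to the objective's curvature while the iterates contract toward the minimizer.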
Year
2016
Venue
ICML
Field
Mathematical optimization, Stochastic optimization, Matrix (mathematics), Computer science, Positive-definite matrix, Algorithm, Regular polygon, Artificial intelligence, Scaling, Population-based incremental learning, Machine learning
DocType
Conference
Citations
5
PageRank
0.43
References
12
Authors
1
Name
Frank E. Curtis
Order
1
Citations
432
PageRank
25.71