Title
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
Abstract
In this paper, a novel stochastic extra-step quasi-Newton method is developed to solve a class of nonsmooth nonconvex composite optimization problems. We assume that the gradient of the smooth part of the objective function can only be approximated by stochastic oracles. The proposed method combines general stochastic higher-order steps derived from an underlying proximal-type fixed-point equation with additional stochastic proximal gradient steps to guarantee convergence. Based on suitable bounds on the step sizes, we establish global convergence to stationary points in expectation, and an extension of the approach using variance reduction techniques is discussed. Motivated by large-scale and big data applications, we investigate a stochastic coordinate-type quasi-Newton scheme that allows us to generate cheap and tractable stochastic higher-order directions. Finally, numerical results on large-scale logistic regression and deep learning problems show that our proposed algorithm compares favorably with other state-of-the-art methods.
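The abstract describes an extra-step template: a stochastic higher-order trial step computed from the proximal fixed-point residual, safeguarded by an additional stochastic proximal gradient step. The following is a minimal sketch of that template under stated assumptions, not the authors' algorithm: it uses an illustrative l1-regularized logistic regression instance, a hypothetical minibatch oracle stoch_grad, and replaces the paper's coordinate-type quasi-Newton direction with a crude Barzilai-Borwein scaling of the residual; all names, step-size rules, and parameters are assumptions for illustration.

    # Sketch of a stochastic extra-step scheme for min f(x) + phi(x),
    # phi = mu*||x||_1, f a logistic loss with a minibatch gradient oracle.
    # Simplified illustration only; the paper's quasi-Newton direction is richer.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic l1-regularized logistic regression data (illustration only).
    n_samples, dim, mu = 1000, 50, 0.01
    A = rng.standard_normal((n_samples, dim))
    x_true = np.where(rng.random(dim) < 0.2, rng.standard_normal(dim), 0.0)
    b = np.where(A @ x_true + 0.1 * rng.standard_normal(n_samples) > 0, 1.0, -1.0)

    def stoch_grad(x, batch=64):
        """Minibatch gradient of the logistic loss (the stochastic oracle)."""
        idx = rng.choice(n_samples, batch, replace=False)
        Ai, bi = A[idx], b[idx]
        s = 1.0 / (1.0 + np.exp(np.clip(bi * (Ai @ x), -30, 30)))  # sigma(-b a'x)
        return -(Ai * (bi * s)[:, None]).mean(axis=0)

    def prox_l1(v, t):
        """Proximal operator of t*mu*||.||_1 (soft-thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - t * mu, 0.0)

    x = np.zeros(dim)
    lam = 1.0                           # prox parameter lambda
    F_old, x_old = None, None
    for k in range(500):
        alpha = 1.0 / (k + 10)          # diminishing step size
        g = stoch_grad(x)
        # Residual of the proximal fixed-point equation x = prox(x - lam*grad).
        F = x - prox_l1(x - lam * g, lam)
        # Crude quasi-Newton surrogate: Barzilai-Borwein scaling of the residual.
        tau = 1.0
        if F_old is not None:
            s, y = x - x_old, F - F_old
            if abs(s @ y) > 1e-12:
                tau = max(min((s @ s) / (s @ y), 1e3), 1e-3)
        z = x - tau * F                 # higher-order trial point
        # Extra stochastic proximal gradient step to safeguard convergence.
        x_old, F_old = x, F
        x = prox_l1(z - alpha * stoch_grad(z), alpha)

The design point the sketch tries to convey: the higher-order step z alone need not decrease the objective, so the iterate is always finished with a plain stochastic proximal gradient step whose diminishing step size drives the convergence analysis.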
Year
2022
DOI
10.1007/s10107-021-01629-y
Venue
Mathematical Programming
Keywords
Nonsmooth stochastic optimization, Stochastic approximation, Global convergence, Stochastic higher order method, Stochastic quasi-Newton scheme, 90C06, 90C15, 90C26, 90C53
DocType
Journal
Volume
194
Issue
1
ISSN
0025-5610
Citations
0
PageRank
0.34
References
87
Authors
4
Name            Order  Citations  PageRank
Yang Minghan    1      0          0.34
Andre Milzarek  2      10         1.97
Zaiwen Wen      3      934        40.20
Zhang, Tong     4      7126       611.43