Title
Sub-sampled Newton methods
Abstract
For large-scale finite-sum minimization problems, we study non-asymptotic and high-probability global as well as local convergence properties of variants of Newton’s method where the Hessian and/or gradients are randomly sub-sampled. For Hessian sub-sampling, using random matrix concentration inequalities, one can sub-sample in a way that second-order information, i.e., curvature, is suitably preserved. For gradient sub-sampling, approximate matrix multiplication results from randomized numerical linear algebra provide a way to construct the sub-sampled gradient which contains as much of the first-order information as possible. While sample sizes all depend on problem specific constants, e.g., condition number, we demonstrate that local convergence rates are problem-independent.
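The two sampling schemes described in the abstract can be sketched on a simple finite-sum least-squares problem. The sketch below is illustrative only: the sample sizes `s_hess`/`s_grad`, the ridge term `reg`, and the exact linear solve are placeholder choices, not the theoretically prescribed values from the paper.

```python
import numpy as np

def subsampled_newton(A, b, x0, s_hess, s_grad, iters=25, reg=1e-8, seed=0):
    """Illustrative sub-sampled Newton iteration for the finite sum
    f(x) = (1/2n) * sum_i (a_i^T x - b_i)^2, whose per-example Hessian
    is the rank-one matrix a_i a_i^T.

    s_hess, s_grad, and reg are hypothetical tuning choices, not the
    sample sizes dictated by the paper's concentration results."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = x0.astype(float).copy()
    for _ in range(iters):
        # Sub-sampled gradient: average of s_grad per-example gradients.
        Sg = rng.choice(n, size=s_grad, replace=False)
        g = A[Sg].T @ (A[Sg] @ x - b[Sg]) / s_grad
        # Sub-sampled Hessian: average of s_hess rank-one terms a_i a_i^T,
        # with a small ridge to guarantee invertibility.
        Sh = rng.choice(n, size=s_hess, replace=False)
        H = A[Sh].T @ A[Sh] / s_hess + reg * np.eye(d)
        # Newton step computed from the sub-sampled quantities.
        x -= np.linalg.solve(H, g)
    return x
```

On a consistent (noiseless) system `b = A @ x_star`, every per-example gradient vanishes at the minimizer, so even the sub-sampled gradient is exact there and the iteration can converge to `x_star`; with noisy data, the gradient sample size would have to grow to retain accuracy, in line with the abstract's discussion.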
Year
2019
DOI
10.1007/s10107-018-1346-5
Venue
Mathematical Programming
Keywords
Newton-type methods, Local and global convergence, Sub-sampling, 49M15, 65K05, 90C25, 90C06
DocType
Journal
Volume
174
Issue
SP1-2
ISSN
1436-4646
Citations
0
PageRank
0.34
References
0
Authors
2
Name | Order | Citations | PageRank
Farbod Roosta-Khorasani | 1 | 10 | 29.25
Michael W. Mahoney | 2 | 3297 | 218.10