Title
An investigation of Newton-Sketch and subsampled Newton methods
Abstract
Sketching, a dimensionality reduction technique, has received much attention in the statistics community. In this paper, we study sketching in the context of Newton's method for solving finite-sum optimization problems in which both the number of variables and the number of data points are large. We study two forms of sketching that perform dimensionality reduction in data space: Hessian subsampling and randomized Hadamard transformations. Each has its own advantages, and their relative tradeoffs have not been investigated in the optimization literature. Our study focuses on practical versions of the two methods in which the resulting linear systems of equations are solved approximately, at every iteration, using an iterative solver. The advantages of using the conjugate gradient method vs. a stochastic gradient iteration are revealed through a set of numerical experiments, and a complexity analysis of the Hessian subsampling method is presented.
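To make the setting concrete, the sketch below illustrates in Python one of the two approaches described in the abstract: a subsampled Newton method in which the full gradient is computed, the Hessian is estimated on a random subsample of the data, and the Newton system is solved inexactly with a few conjugate gradient iterations. This is a minimal sketch of the general scheme, not the paper's algorithm or experiments; the test problem (l2-regularized logistic regression) and all names and parameters (subsampled_newton_cg, sample_frac, cg_iters, step) are illustrative assumptions.

```python
# Minimal illustration (not the paper's implementation) of a subsampled
# Newton-CG iteration for l2-regularized logistic regression with labels
# y in {-1, +1}. The gradient uses all n data points; the Hessian is
# estimated on a random subsample; the Newton system is solved inexactly
# by a fixed, small number of conjugate gradient (CG) iterations.
import numpy as np

def logistic_loss_grad(w, X, y, lam):
    """Full gradient of (1/n) sum log(1 + exp(-y_i x_i^T w)) + (lam/2)||w||^2."""
    z = X @ w
    s = 1.0 / (1.0 + np.exp(-y * z))                  # sigma(y_i x_i^T w)
    return -(X.T @ (y * (1.0 - s))) / X.shape[0] + lam * w

def subsampled_hessian_vec(w, X, y, lam, idx):
    """Return v -> H_S v, the Hessian-vector product on the subsample idx."""
    Xs = X[idx]
    s = 1.0 / (1.0 + np.exp(-(Xs @ w)))
    d = s * (1.0 - s)                                  # diagonal Hessian weights
    def hv(v):
        return (Xs.T @ (d * (Xs @ v))) / len(idx) + lam * v
    return hv

def cg(hv, b, iters=10, tol=1e-8):
    """Standard CG on H p = b, run for only a few iterations (inexact solve)."""
    p = np.zeros_like(b)
    r = b.copy()
    d = r.copy()
    rs = r @ r
    for _ in range(iters):
        Hd = hv(d)
        alpha = rs / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

def subsampled_newton_cg(X, y, lam=1e-3, sample_frac=0.1,
                         cg_iters=10, max_iter=20, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w = np.zeros(dim)
    m = max(1, int(sample_frac * n))
    for _ in range(max_iter):
        g = logistic_loss_grad(w, X, y, lam)
        idx = rng.choice(n, size=m, replace=False)     # Hessian subsample S_k
        hv = subsampled_hessian_vec(w, X, y, lam, idx)
        p = cg(hv, -g, iters=cg_iters)                 # inexact Newton direction
        w = w + step * p                               # fixed step for simplicity
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 50))
    w_true = rng.standard_normal(50)
    y = np.where(X @ w_true + 0.1 * rng.standard_normal(2000) > 0, 1.0, -1.0)
    w = subsampled_newton_cg(X, y)
    print("final gradient norm:", np.linalg.norm(logistic_loss_grad(w, X, y, 1e-3)))
```

Only Hessian-vector products with the subsampled Hessian are needed, so the subsample never has to be formed as an explicit matrix; replacing the CG inner solver with a stochastic gradient iteration gives the alternative inner solver compared in the paper.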
Year
2020
DOI
10.1080/10556788.2020.1725751
Venue
OPTIMIZATION METHODS & SOFTWARE
Keywords
Sketching, subsampling, Newton's method, machine learning, stochastic optimization
Field
Conjugate gradient method, Spectral properties, Mathematical optimization, Algorithm, Optimization problem, Mathematics, Sketch
DocType
Journal
Volume
35
Issue
SP4
ISSN
1055-6788
Citations
4
PageRank
0.38
References
10
Authors
3
Name                Order   Citations  PageRank
Albert S. Berahas   1       21         4.05
Raghu Bollapragada  2       4          0.38
Jorge Nocedal       3       3276       301.50