Title
Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization
Abstract
Worst-case complexity guarantees for nonconvex optimization algorithms have been a topic of growing interest. Multiple frameworks that achieve the best known complexity bounds among a broad class of first- and second-order strategies have been proposed. These methods have often been designed primarily with complexity guarantees in mind and, as a result, represent a departure from the algorithms that have proved to be the most effective in practice. In this paper, we consider trust-region Newton methods, one of the most popular classes of algorithms for solving nonconvex optimization problems. By introducing slight modifications to the original scheme, we obtain two methods, one based on exact subproblem solves and one exploiting inexact subproblem solves as in the popular "trust-region Newton-conjugate gradient" (trust-region Newton-CG) method, with iteration and operation complexity bounds that match the best known bounds for the aforementioned class of first- and second-order methods. The resulting trust-region Newton-CG method also retains the attractive practical behavior of classical trust-region Newton-CG, which we demonstrate with numerical comparisons on a standard benchmark test set.
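The following is a minimal sketch, in Python with NumPy, of the classical trust-region Newton-CG iteration that the paper builds on: the trust-region subproblem is solved approximately by the Steihaug-Toint truncated conjugate gradient method, which stops at the trust-region boundary when negative curvature or an over-long step is detected. It illustrates only the baseline scheme, not the modified methods with second-order complexity guarantees analyzed in the paper; the function and parameter names (steihaug_cg, trust_region_newton_cg, eta, delta0) and the Rosenbrock test problem are illustrative assumptions.

import numpy as np

def steihaug_cg(grad, hess, delta, tol=1e-8, max_iter=None):
    # Approximately solve min_p g^T p + 0.5 p^T H p subject to ||p|| <= delta
    # with Steihaug-Toint truncated conjugate gradients.
    n = grad.size
    if max_iter is None:
        max_iter = 2 * n
    p = np.zeros(n)
    r = grad.copy()          # residual of the Newton system H p = -g
    d = -r                   # first search direction: steepest descent
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter):
        Hd = hess @ d
        dHd = d @ Hd
        if dHd <= 0:
            # Negative curvature detected: move to the trust-region boundary along d.
            return _to_boundary(p, d, delta)
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            # The step would leave the trust region: stop on the boundary.
            return _to_boundary(p, d, delta)
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        p, r = p_next, r_next
    return p

def _to_boundary(p, d, delta):
    # Choose tau >= 0 so that ||p + tau * d|| = delta (positive root of a quadratic).
    a = d @ d
    b = 2.0 * (p @ d)
    c = p @ p - delta**2
    tau = (-b + np.sqrt(b**2 - 4.0 * a * c)) / (2.0 * a)
    return p + tau * d

def trust_region_newton_cg(f, grad_f, hess_f, x0, delta0=1.0, max_delta=100.0,
                           eta=0.1, gtol=1e-6, max_iter=200):
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < gtol:
            break
        H = hess_f(x)
        p = steihaug_cg(g, H, delta)
        pred = -(g @ p + 0.5 * p @ (H @ p))   # predicted decrease of the quadratic model
        ared = f(x) - f(x + p)                # actual decrease of the objective
        rho = ared / pred if pred > 0 else -1.0
        if rho < 0.25:
            delta *= 0.25                     # shrink the region on poor model agreement
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta = min(2.0 * delta, max_delta)
        if rho > eta:
            x = x + p                         # accept the trial step
    return x

# Illustrative usage on the (nonconvex) Rosenbrock function.
def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def rosenbrock_hess(x):
    return np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

if __name__ == "__main__":
    sol = trust_region_newton_cg(rosenbrock, rosenbrock_grad, rosenbrock_hess,
                                 x0=[-1.2, 1.0])
    print("approximate minimizer:", sol)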
Year
2021
DOI
10.1137/19M130563X
Venue
SIAM Journal on Optimization
Keywords
smooth nonconvex optimization, trust-region methods, Newton's method, conjugate gradient method, Lanczos method, worst-case complexity, negative curvature
DocType
Journal
Volume
31
Issue
1
ISSN
1052-6234
Citations
0
PageRank
0.34
References
0
Authors
4
Name, Order, Citations, PageRank
Frank E. Curtis, 1, 432, 25.71
Daniel P. Robinson, 2, 261, 21.51
Royer Clément, 3, 0, 0.34
S. J. Wright, 4, 4391, 372.21