Abstract
---
In this paper, we study greedy variants of quasi-Newton methods. They are based on the updating formulas from a certain subclass of the Broyden family. In particular, this subclass includes the well-known DFP, BFGS, and SR1 updates. However, in contrast to the classical quasi-Newton methods, which use the difference of successive iterates for updating the Hessian approximations, our methods apply basis vectors, greedily selected so as to maximize a certain measure of progress. For greedy quasi-Newton methods, we establish an explicit nonasymptotic bound on their rate of local superlinear convergence, as applied to minimizing strongly convex and strongly self-concordant functions (and, in particular, to strongly convex functions with Lipschitz continuous Hessian). The established superlinear convergence rate contains a contraction factor, which depends on the square of the iteration counter. We also show that greedy quasi-Newton methods produce Hessian approximations whose deviation from the exact Hessians linearly converges to zero.
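The sketch below is only an illustrative reading of the abstract, not the paper's exact scheme: a greedy BFGS-type update in which the basis vector is chosen to maximize the ratio u^T G u / u^T A u, one plausible instance of the "measure of progress" mentioned above. The function names (`bfgs_update`, `greedy_direction`, `greedy_bfgs_step`), the unit step size, and the initialization of the approximation are assumptions introduced for illustration.

```python
import numpy as np

def bfgs_update(A, G, u):
    # BFGS update of the approximation G along direction u.
    # The exact Hessian A enters only through the product A @ u,
    # which supplies the curvature information along u.
    Gu, Au = G @ u, A @ u
    return G - np.outer(Gu, Gu) / (u @ Gu) + np.outer(Au, Au) / (u @ Au)

def greedy_direction(A, G):
    # Pick the coordinate basis vector e_i maximizing
    # (e_i^T G e_i) / (e_i^T A e_i); this ratio is an assumed
    # instance of the "measure of progress" from the abstract.
    i = int(np.argmax(np.diag(G) / np.diag(A)))
    e = np.zeros(A.shape[0])
    e[i] = 1.0
    return e

def greedy_bfgs_step(grad, hess, x, G):
    # One illustrative iteration: a quasi-Newton step with the current
    # approximation G, followed by a greedy BFGS correction of G at the
    # new point.
    x_new = x - np.linalg.solve(G, grad(x))
    A_new = hess(x_new)
    u = greedy_direction(A_new, G)
    return x_new, bfgs_update(A_new, G, u)

# Toy usage on the strongly convex quadratic f(x) = 0.5 x^T Q x,
# with G initialized to dominate the Hessian (G >= Q).
Q = np.diag([1.0, 10.0, 100.0])
x, G = np.ones(3), 100.0 * np.eye(3)
for _ in range(10):
    x, G = greedy_bfgs_step(lambda v: Q @ v, lambda v: Q, x, G)
```

In this simplified setting, choosing the basis vector greedily (rather than using the difference of successive iterates) drives the approximation G toward the exact Hessian Q along the coordinate directions where it is currently worst, which mirrors the mechanism behind the linear convergence of the Hessian approximations stated in the abstract.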
Year | DOI | Venue
---|---|---
2021 | 10.1137/20M1320651 | SIAM JOURNAL ON OPTIMIZATION
Keywords | DocType | Volume
---|---|---
quasi-Newton methods, Broyden family, SR1, DFP, BFGS, superlinear convergence, local convergence, rate of convergence | Journal | 31
Issue | ISSN | Citations
---|---|---
1 | 1052-6234 | 0
PageRank | References | Authors
---|---|---
0.34 | 0 | 2
Name | Order | Citations | PageRank |
---|---|---|---
Rodomanov Anton | 1 | 0 | 0.34 |
Yurii Nesterov | 2 | 1800 | 168.77 |