Title
Limited-memory BFGS with displacement aggregation
Abstract
A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing the limited-memory method can achieve the same theoretical convergence properties as when full-memory (inverse) Hessian approximations are stored and employed, such as a local superlinear rate of convergence under assumptions that are common for attaining such guarantees. To the best of our knowledge, this is the first work in which a local superlinear convergence rate guarantee is offered by a quasi-Newton scheme that does not either store all curvature pairs throughout the entire run of the optimization algorithm or store an explicit (inverse) Hessian approximation. Numerical results are presented to show that displacement aggregation within an adaptive L-BFGS scheme can lead to better performance than standard L-BFGS.
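For context, the sketch below shows the standard L-BFGS two-loop recursion that applies the implicit inverse Hessian approximation built from stored curvature pairs (s_k, y_k). This is general background, not the authors' method: the function name and structure are illustrative assumptions, and the displacement aggregation step proposed in the paper, which would modify the stored pairs before this recursion is applied so that the result matches the full-memory BFGS approximation, is not shown.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion (illustrative sketch).

    Applies the implicit inverse Hessian approximation, defined by the
    stored curvature pairs (s_k, y_k) ordered oldest to newest, to the
    current gradient and returns a search direction.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []

    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y

    # Initial scaling gamma_k = (s_k^T y_k) / (y_k^T y_k) from the newest pair.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = np.dot(s, y) / np.dot(y, y)
    else:
        gamma = 1.0
    r = gamma * q

    # Second loop: oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s

    return -r  # search direction
```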
Year
2022
DOI
10.1007/s10107-021-01621-6
Venue
Mathematical Programming
Keywords
Nonlinear optimization, Quasi-Newton algorithms, Broyden–Fletcher–Goldfarb–Shanno (BFGS), Limited-memory BFGS, Superlinear convergence, 49M37, 65K05, 65K10, 90C30, 90C53
DocType
Journal
Volume
194
Issue
1
ISSN
0025-5610
Citations
0
PageRank
0.34
References
18
Authors
3
Name                Order  Citations  PageRank
Albert S. Berahas   1      0          0.34
Frank E. Curtis     2      432        25.71
Baoyu Zhou          3      0          0.34