Abstract |
---|
This paper presents a finite-difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval h based on the noise estimation techniques of Hamming [Introduction to Applied Numerical Analysis, Courier Corporation, North Chelmsford, MA, 2012] and Moré and Wild [SIAM J. Sci. Comput., 33 (2011), pp. 1292-1314]. This noise estimation procedure and the selection of h are inexpensive but not always accurate, and to prevent failures the algorithm incorporates a recovery mechanism that takes appropriate action when the line-search procedure is unable to produce an acceptable point. A novel convergence analysis is presented that considers the effect of a noisy line-search procedure. Numerical experiments comparing the method to a function-interpolating trust-region method are presented. |
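The abstract's central idea, choosing the differencing interval h from an estimate of the noise in f, can be illustrated with a minimal sketch. The rule below, h = 2*sqrt(eps_f/mu2), is the classical forward-difference trade-off between truncation error (~h*mu2/2) and noise error (~2*eps_f/h); it stands in for, and is not necessarily identical to, the paper's adaptive procedure, and `eps_f` (noise level) and `mu2` (bound on |f''|) are assumed inputs here rather than quantities the paper estimates this way.

```python
import numpy as np

def fd_gradient(f, x, eps_f, mu2=1.0):
    """Forward-difference gradient of a noisy function f at x.

    eps_f : assumed estimate of the absolute noise level in f
    mu2   : assumed bound on |f''| along each coordinate

    The interval h minimizes the sum of the truncation error
    (~ h*mu2/2) and the noise error (~ 2*eps_f/h), giving
    h = 2*sqrt(eps_f/mu2).
    """
    h = 2.0 * np.sqrt(eps_f / mu2)
    fx = f(x)
    g = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h            # perturb one coordinate at a time
        g[i] = (f(x + e) - fx) / h
    return g
```

In a quasi-Newton setting, a gradient estimate of this kind would feed the BFGS update in place of an exact gradient; when the noise estimate is poor, the resulting search direction can fail the line search, which is what motivates the recovery mechanism described in the abstract.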
Year | DOI | Venue
---|---|---
2019 | 10.1137/18M1177718 | SIAM Journal on Optimization
Keywords | Field | DocType
---|---|---
derivative-free optimization, nonlinear optimization, stochastic optimization | Convergence (routing), Hamming code, Stochastic optimization, Derivative-free optimization, Mathematical optimization, Interpolation, Nonlinear programming, Algorithm, Numerical analysis, Broyden–Fletcher–Goldfarb–Shanno algorithm, Mathematics | Journal
Volume | Issue | ISSN
---|---|---
29 | 2 | 1052-6234
Citations | PageRank | References
---|---|---
3 | 0.46 | 16
Authors |
---|
3 |
Name | Order | Citations | PageRank
---|---|---|---
Albert S. Berahas | 1 | 21 | 4.05 |
Richard H. Byrd | 2 | 2234 | 227.38 |
Jorge Nocedal | 3 | 3276 | 301.50 |