| Abstract |
|---|
| We propose and analyze an asynchronously parallel optimization algorithm for finding multiple, high-quality minima of nonlinear optimization problems. Our multistart algorithm considers all previously evaluated points when determining where to start or continue a local optimization run. Theoretical results show that when there are finitely many minima, the algorithm almost surely starts a finite number of local optimization runs and identifies every minimum. The algorithm is applicable to general optimization settings, but our numerical results focus on the case when derivatives are unavailable. In numerical tests, a Python implementation of the algorithm is shown to yield good approximations of many minima (including a global minimum), and this ability is shown to scale well with additional resources. Our implementation's time to solution is also shown to scale well even when the time to perform a function evaluation is highly variable. An implementation of the algorithm is available in the libEnsemble library at https://github.com/Libensemble/libensemble. |
| Year | DOI | Venue |
|---|---|---|
| 2018 | 10.1007/s12532-017-0131-4 | Math. Program. Comput. |
| Keywords | Field | DocType |
|---|---|---|
| Parallel optimization algorithms, Multistart, Global optimization, Derivative-free optimization, Concurrent function evaluations, 65K05, 90C26, 90C30, 90C56 | Derivative-free optimization, Mathematical optimization, Global optimization, Nonlinear programming, Maxima and minima, Solver, Almost surely, Local search (optimization), Mathematics, Python (programming language) | Journal |
| Volume | Issue | ISSN |
|---|---|---|
| 10 | 3 | 1867-2949 |
| Citations | PageRank | References |
|---|---|---|
| 1 | 0.37 | 14 |
| Authors |
|---|
| 2 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Jeffrey Larson | 1 | 32 | 5.46 |
| Stefan M. Wild | 2 | 481 | 31.93 |