Abstract |
---|
In many optimization problems arising from scientific, engineering and artificial intelligence applications, objective and constraint functions are available only as the output of a black-box or simulation oracle that does not provide derivative information. Such settings necessitate methods for derivative-free, or zeroth-order, optimization. We provide a review of, and perspectives on, developments in these methods, with an emphasis on highlighting recent developments and on unifying the treatment of such problems in the non-linear optimization and machine learning literature. We categorize methods based on the assumed properties of the black-box functions, as well as on features of the methods themselves. We first overview the primary setting of deterministic methods applied to unconstrained, non-convex optimization problems where the objective function is defined by a deterministic black-box oracle. We then discuss developments in randomized methods, methods that assume some additional structure about the objective (including convexity, separability and general non-smooth compositions), methods for problems where the output of the black-box oracle is stochastic, and methods for handling different types of constraints. |
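As a concrete illustration of the black-box setting described in the abstract, below is a minimal sketch of one classical deterministic derivative-free method, coordinate (compass) search, which queries the objective only through zeroth-order evaluations. This is not taken from the paper; the function and parameter names are illustrative assumptions.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_evals=1000):
    """Minimize f using only function values (no derivative information)."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    n = x.size
    while step > tol and evals < max_evals:
        improved = False
        # Poll the 2n coordinate directions +/- e_i at the current step size.
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                f_trial = f(trial)
                evals += 1
                if f_trial < fx:      # accept any improving trial point
                    x, fx = trial, f_trial
                    improved = True
        if not improved:
            step *= 0.5               # shrink the stencil after an unsuccessful poll
    return x, fx

# Example: a smooth quadratic treated as a black box (only f(x) is available).
if __name__ == "__main__":
    xstar, fstar = compass_search(lambda x: np.sum((x - 1.0) ** 2), np.zeros(3))
    print(xstar, fstar)
```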
Year | DOI | Venue
---|---|---
2019 | 10.1017/S0962492919000060 | Acta Numer.

DocType | Volume | ISSN
---|---|---
Journal | 28 | 0962-4929

Citations | PageRank | References
---|---|---
11 | 0.76 | 0
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Jeffrey Larson | 1 | 32 | 5.46 |
Matt Menickelly | 2 | 14 | 2.85 |
Stefan M. Wild | 3 | 481 | 31.93 |