Title: An adaptive gradient sampling algorithm for non-smooth optimization
Abstract: We present an algorithm for the minimization of f: ℝⁿ → ℝ, assumed to be locally Lipschitz and continuously differentiable in an open dense subset 𝒟 of ℝⁿ. The objective f may be non-smooth and/or non-convex. The method is based on the gradient sampling (GS) algorithm of Burke et al. [A robust gradient sampling algorithm for nonsmooth, nonconvex optimization, SIAM J. Optim. 15 (2005), pp. 751–779]. It differs, however, from previously proposed versions of GS in that it is variable-metric and only O(1), not O(n), gradient evaluations are required per iteration. Numerical experiments illustrate that the algorithm is more efficient than GS in that it consistently makes more progress towards a solution within a given number of gradient evaluations. In addition, the adaptive sampling procedure allows for warm-starting of the quadratic subproblem solver, so the average number of subproblem iterations per nonlinear iteration is also consistently reduced. Global convergence of the algorithm is proved assuming that the Hessian approximations are positive definite and bounded, an assumption shown to hold for the proposed Hessian approximation updating strategies.
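For intuition, the following is a minimal sketch of one plain gradient-sampling iteration in the sense of Burke et al., not the paper's adaptive, variable-metric method. It assumes NumPy/SciPy, and the names (gs_step, min_norm_element) and parameter defaults are illustrative, not from the paper.

```python
# Sketch of a basic gradient-sampling step (illustrative only).
import numpy as np
from scipy.optimize import minimize

def min_norm_element(G):
    """Minimum-norm vector in the convex hull of the rows of G.

    Solves min ||G^T lam||^2 s.t. lam >= 0, sum(lam) = 1 -- the
    quadratic subproblem at the heart of gradient sampling methods.
    """
    k = G.shape[0]
    Q = G @ G.T
    res = minimize(
        lambda lam: lam @ Q @ lam,
        x0=np.full(k, 1.0 / k),           # uniform starting weights
        jac=lambda lam: 2.0 * Q @ lam,
        bounds=[(0.0, None)] * k,
        constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
        method="SLSQP",
    )
    return res.x @ G

def gs_step(f, grad, x, eps=1e-1, m=None, beta=0.5, c=1e-4):
    """One plain GS step: sample m gradients in an eps-ball around x
    (basic GS uses m >= n+1 fresh samples per iteration; the paper's
    adaptive variant adds only O(1) new gradients), take the min-norm
    convex combination as a stabilized steepest-descent direction,
    then backtrack with an Armijo-type line search."""
    n = x.size
    m = m or n + 1
    pts = x + eps * (2.0 * np.random.rand(m, n) - 1.0)
    G = np.vstack([grad(x)] + [grad(p) for p in pts])
    g = min_norm_element(G)
    d = -g
    t, fx = 1.0, f(x)
    while f(x + t * d) > fx - c * t * (g @ g) and t > 1e-12:
        t *= beta
    return x + t * d
```

Because the adaptive scheme introduces only O(1) new gradients into the sample set per iteration, the subproblem data change little between iterations, which is what makes warm-starting the quadratic subproblem solver (here, the weights lam) effective.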
Year: 2013
DOI: 10.1080/10556788.2012.714781
Venue: Optimization Methods and Software
Keywords: robust gradient, adaptive sampling procedure, adaptive gradient, gradient evaluation, quadratic subproblem solver, proposed Hessian approximation, average number, Hessian approximation, non-smooth optimization, subproblem iteration, GS algorithm, nonlinear iteration, quadratic optimization
Field: Gradient method, Discrete mathematics, Random search, Mathematical optimization, Gradient descent, Hessian matrix, Algorithm, Nonlinear conjugate gradient method, Quadratic programming, Random optimization, Broyden–Fletcher–Goldfarb–Shanno algorithm, Mathematics
DocType: Journal
Volume: 28
Issue: 6
ISSN: 1055-6788
Citations: 9
PageRank: 0.52
References: 11
Authors: 2
Name             Order  Citations  PageRank
Frank E. Curtis  1      432        25.71
Xiaocun Que      2      13         0.90