Title
Non-smooth Non-convex Bregman Minimization: Unification and New Algorithms
Abstract
We propose a unifying algorithm for non-smooth non-convex optimization. The algorithm approximates the objective function by a convex model function and finds an approximate (Bregman) proximal point of the convex model. This approximate minimizer of the model function yields a descent direction, along which the next iterate is found. Complemented with an Armijo-like line search strategy, we obtain a flexible algorithm for which we prove (subsequential) convergence to a stationary point under weak assumptions on the growth of the model function error. Special instances of the algorithm with a Euclidean distance function are, for example, gradient descent, forward–backward splitting, and ProxDescent, without the common requirement of a "Lipschitz continuous gradient". In addition, we consider a broad class of Bregman distance functions (generated by Legendre functions), replacing the Euclidean distance. The algorithm has a wide range of applications, including many linear and nonlinear inverse problems in signal/image processing and machine learning.
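To make the abstract's scheme concrete, the following is a minimal sketch of the Euclidean special case, in which the approximate model minimizer reduces to a forward-backward step (one of the special instances the abstract names). It assumes an l1-regularized least-squares objective; the helper names (model_line_search, soft_threshold) and parameter values (t, gamma) are illustrative choices, not taken from the paper.

```python
# Sketch: convex model of f + g at x is f(x) + <grad f(x), y - x> + g(y);
# with a Euclidean proximity term its minimizer is the forward-backward
# step prox_{t*g}(x - t*grad f(x)). An Armijo-like backtracking search
# is then run along the resulting descent direction.
import numpy as np

def soft_threshold(v, tau):
    # Closed-form proximal map of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def model_line_search(x0, f, grad_f, g, prox_g, t=1.0, gamma=1e-4,
                      max_iter=500, tol=1e-8):
    x = x0.copy()
    for _ in range(max_iter):
        gx = grad_f(x)
        z = prox_g(x - t * gx, t)      # approximate minimizer of the convex model
        d = z - x                      # descent direction
        if np.linalg.norm(d) < tol:    # (near-)stationary point reached
            break
        # Decrease predicted by the model; negative away from stationarity.
        model_decrease = gx @ d + g(z) - g(x)
        Fx = f(x) + g(x)
        s = 1.0
        # Armijo-like test: demand a fraction gamma of the model decrease.
        while f(x + s * d) + g(x + s * d) > Fx + gamma * s * model_decrease:
            s *= 0.5
            if s < 1e-12:
                break
        x = x + s * d
    return x

# Usage: l1-regularized least squares, a simple instance of the setting.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
x_hat = model_line_search(
    np.zeros(50),
    f=lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2,
    grad_f=lambda x: A.T @ (A @ x - b),
    g=lambda x: lam * np.abs(x).sum(),
    prox_g=lambda v, t: soft_threshold(v, lam * t),
)
```

Replacing the exact prox with any approximate (Bregman) proximal point, as the paper allows, only changes the computation of z; the descent-direction and line-search structure stays the same.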
Year: 2017
DOI: 10.1007/s10957-018-01452-0
Venue: Journal of Optimization Theory and Applications
Keywords: Bregman minimization, Legendre function, Model function, Growth function, Non-convex non-smooth, Abstract algorithm, 49J52, 65K05, 65K10, 90C26
Field: Stochastic gradient descent, Gradient descent, Mathematical optimization, Euclidean distance, Proximal Gradient Methods, Algorithm, Descent direction, Stationary point, Bregman divergence, Lipschitz continuity, Mathematics
DocType: Journal
Volume: abs/1707.02278
Issue: 1
ISSN: 0022-3239
Citations: 5
PageRank: 0.39
References: 16
Authors: 3

Name          Order  Citations  PageRank
Peter Ochs    1      182        9.43
Jalal Fadili  2      1184       80.08
Thomas Brox   3      7866       327.52