Title
Convergence rates of Forward–Douglas–Rachford splitting method
Abstract
Over the past decades, operator splitting methods have become ubiquitous in non-smooth optimization owing to their simplicity and efficiency. In this paper, we consider the Forward–Douglas–Rachford splitting method and study both its global and its local convergence rates. For the global rate, we establish a sublinear convergence rate in terms of a Bregman divergence suitably designed for the objective function. Moreover, when specializing to the Forward–Backward splitting, we prove a stronger convergence rate result for the objective function value. Locally, under the assumption that the non-smooth part of the optimization problem is partly smooth, we establish local linear convergence of the method. More precisely, we show that the sequence generated by Forward–Douglas–Rachford first (i) identifies a smooth manifold in a finite number of iterations and then (ii) enters a local linear convergence regime, which is characterized, for instance, in terms of the structure of the underlying active smooth manifold. To illustrate the usefulness of the obtained results, we consider several concrete numerical experiments from application fields including signal/image processing, inverse problems and machine learning.
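The following is a minimal Python sketch of one common formulation of the Forward–Douglas–Rachford iteration (in the spirit of Briceño-Arias' splitting), assuming the problem takes the form min over x in V of F(x) + R(x), with F smooth, R proximable and V a closed vector subspace; whether this matches the paper's exact setting is an assumption. The operators grad_F, prox_R and proj_V are user-supplied placeholders rather than the paper's notation. The example at the bottom takes V to be the whole space, in which case the scheme reduces to the Forward–Backward splitting mentioned in the abstract, applied to a Lasso-type sparse recovery problem.

import numpy as np

def fdr(grad_F, prox_R, proj_V, z0, gamma=0.1, lam=1.0, n_iter=500):
    """Sketch of a Forward--Douglas--Rachford iteration (one common form);
    returns the primal iterate x_k = proj_V(z_k)."""
    z = z0.copy()
    for _ in range(n_iter):
        x = proj_V(z)                                  # primal point on the subspace V
        y = prox_R(2.0 * x - z - gamma * proj_V(grad_F(x)), gamma)
        z = z + lam * (y - x)                          # relaxed fixed-point update
    return proj_V(z)

if __name__ == "__main__":
    # Hypothetical example: min 0.5*||Ax - b||^2 + mu*||x||_1 over the whole
    # space, so proj_V is the identity and FDR reduces to Forward--Backward.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true
    mu = 0.1
    grad_F = lambda x: A.T @ (A @ x - b)
    prox_R = lambda x, g: np.sign(x) * np.maximum(np.abs(x) - g * mu, 0.0)  # soft-thresholding
    proj_V = lambda x: x                               # identity projector: V is the whole space
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2            # step size below 2/L with L = ||A||_2^2
    x_hat = fdr(grad_F, prox_R, proj_V, np.zeros(100), gamma=gamma, n_iter=1000)
    print("support identified:", np.flatnonzero(np.abs(x_hat) > 1e-6))

In this sketch, finite identification of the active manifold (here, the support of the sparse solution) can be observed empirically by monitoring when the printed support stabilizes across iterations.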
Year
2019
DOI
10.17863/CAM.39021
Venue
Journal of Optimization Theory and Applications
Keywords
Forward–Douglas–Rachford, Forward–Backward, Bregman distance, Partial smoothness, Finite identification, Local linear convergence, 49J52, 65K05, 65K10, 90C25
Field
Convergence (routing), Applied mathematics, Mathematical optimization, Finite set, Rate of convergence, Local convergence, Inverse problem, Bregman divergence, Optimization problem, Manifold, Mathematics
DocType
Journal
Volume
182
Issue
2
ISSN
1573-2878
Citations
2
PageRank
0.38
References
6
Authors
3
Name | Order | Citations | PageRank
Cesare Molinari | 1 | 2 | 0.38
Jingwei Liang | 2 | 52 | 7.41
Jalal Fadili | 3 | 1184 | 80.08