Title
Inertial Proximal Gradient Methods with Bregman Regularization for a Class of Nonconvex Optimization Problems
Abstract
This paper proposes an inertial Bregman proximal gradient method for minimizing the sum of two possibly nonconvex functions. The method incorporates two different inertial steps and adopts Bregman regularization in solving the subproblem. Under some general parameter constraints, we prove subsequential convergence: every accumulation point of the generated sequence is a stationary point of the considered problem. To relax these parameter constraints, we further propose a nonmonotone line search strategy that makes the parameter selection more flexible, and we establish subsequential convergence of the method with line search. When the line search is monotone, we prove stronger global convergence and a linear convergence rate under the Kurdyka-Lojasiewicz framework. Moreover, numerical results on SCAD and MCP nonconvex penalty problems are reported to demonstrate the effectiveness and superiority of the proposed methods and line search strategy.
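The abstract describes an inertial proximal gradient scheme with two inertial steps and a Bregman-regularized subproblem, tested on MCP penalty problems. The following is a minimal sketch of such a scheme, not the paper's exact algorithm: it assumes the Euclidean Bregman kernel h = ½‖·‖² (so the subproblem reduces to a standard proximal step), fixed inertial parameters `alpha` and `beta` (no line search), and an MCP-penalized least-squares instance; all names and constants are illustrative.

```python
import numpy as np

def prox_mcp(z, lam, theta, t):
    """Closed-form proximal operator of the separable MCP penalty
    p(x) = lam*|x| - x^2/(2*theta) if |x| <= theta*lam, else theta*lam^2/2,
    evaluated with step size t (requires theta > t)."""
    assert theta > t
    x = np.zeros_like(z)
    az = np.abs(z)
    # middle region: soft-threshold then rescale
    mid = (az > t * lam) & (az <= theta * lam)
    x[mid] = np.sign(z[mid]) * (az[mid] - t * lam) / (1.0 - t / theta)
    # outer region: the penalty is flat, so the prox is the identity
    big = az > theta * lam
    x[big] = z[big]
    return x

def inertial_prox_grad(A, b, lam=0.1, theta=4.0, alpha=0.3, beta=0.3, iters=500):
    """Two-point inertial proximal gradient sketch for
    min_x 0.5*||A x - b||^2 + MCP(x), with Euclidean Bregman kernel."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    t = 1.0 / L                            # step size
    x_prev = x = np.zeros(A.shape[1])
    for _ in range(iters):
        y = x + alpha * (x - x_prev)       # first inertial point (gradient)
        z = x + beta * (x - x_prev)        # second inertial point (prox center)
        grad = A.T @ (A @ y - b)
        x_prev, x = x, prox_mcp(z - t * grad, lam, theta, t)
    return x
```

With `alpha = beta` this collapses to a single extrapolation point; keeping them separate mirrors the two distinct inertial steps mentioned in the abstract.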
Year
2021
DOI
10.1007/s10898-020-00943-7
Venue
JOURNAL OF GLOBAL OPTIMIZATION
Keywords
Nonconvex, Nonsmooth, Inertial proximal gradient method, Bregman regularization, Kurdyka-Lojasiewicz property, Global convergence
DocType
Journal
Volume
79
Issue
3
ISSN
0925-5001
Citations
0
PageRank
0.34
References
0
Authors
4
Name          Order  Citations  PageRank
Zhongming Wu  1      1          2.05
Chongshou Li  2      15         4.94
Min Li        3      82         10.65
Andrew Lim    4      9378       9.78