Title
Contracting Proximal Methods for Smooth Convex Optimization
Abstract
In this paper, we propose new accelerated methods for smooth convex optimization, called contracting proximal methods. At every step of these methods, we need to minimize a contracted version of the objective function augmented by a regularization term in the form of a Bregman divergence. We provide a global convergence analysis for a general scheme that admits inexactness in solving the auxiliary subproblem. When high-order tensor methods are used for this purpose, we demonstrate an acceleration effect for both convex and uniformly convex composite objective functions. Thus, our construction explains acceleration for methods of any order starting from one. The increase in the number of oracle calls caused by computing the contracted proximal steps is limited to a logarithmic factor in the worst-case complexity bound.
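To make the shape of the subproblem concrete, here is a minimal sketch of an inexact proximal-point iteration, specialized to the Euclidean case where the Bregman divergence reduces to (1/2)||x - x_k||^2. This is not the authors' contracting proximal method (no contraction of the objective, no tensor inner solver); it only illustrates the regularized auxiliary problem the abstract describes, with the inner problem solved inexactly by a few gradient steps. The quadratic objective, the matrix `A`, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative smooth convex objective: f(x) = 0.5 * x^T A x, with A positive definite.
A = np.diag([1.0, 10.0])        # ill-conditioned quadratic (assumption for the demo)
x = np.array([5.0, 5.0])        # starting point
lam = 1.0                       # proximal (regularization) parameter

def grad_f(y):
    return A @ y

for _ in range(50):
    # Auxiliary subproblem:  min_y  f(y) + (1 / (2 * lam)) * ||y - x||^2.
    # Solve it INEXACTLY with a few gradient steps, mirroring the abstract's
    # theme of admitting inexactness in the auxiliary subproblem.
    y = x.copy()
    step = 1.0 / (np.max(np.diag(A)) + 1.0 / lam)   # 1/L for the subproblem
    for _ in range(20):
        y -= step * (grad_f(y) + (y - x) / lam)
    x = y

print(np.linalg.norm(x))        # approaches the minimizer x* = 0
```

Each outer step contracts the iterate toward the minimizer even though the inner problem is solved only approximately; the paper's analysis quantifies exactly how much inner inexactness can be tolerated.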
Year
2020

DOI
10.1137/19M130769X

Venue
SIAM JOURNAL ON OPTIMIZATION
Keywords
convex optimization, proximal method, accelerated methods, global complexity bounds, high-order algorithms

DocType
Journal

Volume
30

Issue
4

ISSN
1052-6234

Citations
0

PageRank
0.34

References
0

Authors
2
Name | Order | Citations | PageRank
Nikita Doikov | 1 | 2 | 2.42
Yurii Nesterov | 2 | 18001 | 68.77