Title
Adaptive Catalyst for Smooth Convex Optimization
Abstract
In 2015, the universal framework Catalyst appeared; it allows one to accelerate almost arbitrary non-accelerated deterministic and randomized algorithms for smooth convex optimization problems (Lin et al., 2015). This technique has found many applications in machine learning due to its ability to handle sum-type objective functions. A significant part of the Catalyst approach is the accelerated proximal outer gradient method, which serves as an envelope for a non-accelerated inner algorithm applied to the regularized auxiliary problem. One of the main practical problems of this approach is the selection of the regularization parameter. A nice theory for this selection exists (Lin et al., 2018), but it requires prior knowledge of the smoothness constant of the objective function. In this paper, we propose an adaptive variant of Catalyst that does not require such information. In combination with an adaptive inner non-accelerated algorithm, we obtain accelerated variants of well-known methods: steepest descent, adaptive coordinate descent, and alternating minimization.
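To make the envelope structure described above concrete, here is a minimal Python sketch of a Catalyst-style outer loop: each outer step approximately minimizes the regularized auxiliary problem f(x) + κ/2 ||x − y||² with a non-accelerated inner solver and then applies a Nesterov-type extrapolation. The doubling/halving update of κ driven by the inner iteration count (`target_inner`), the plain gradient-descent inner solver `gd_inner`, and all parameter values are illustrative assumptions, not the authors' actual adaptive procedure.

```python
import numpy as np


def gd_inner(F, gF, x0, tol=1e-6, max_iter=1000, lr=1e-2):
    """Non-accelerated inner solver: plain gradient descent (illustrative)."""
    x = x0.copy()
    n = 0
    while n < max_iter and np.linalg.norm(gF(x)) > tol:
        x = x - lr * gF(x)
        n += 1
    return x, n


def adaptive_catalyst(f, grad_f, x0, inner_solver=gd_inner, kappa=1.0,
                      outer_iters=50, inner_tol=1e-6, target_inner=100):
    """Catalyst-style envelope with a heuristic (hypothetical) kappa update."""
    x_prev = x0.copy()
    y = x0.copy()
    alpha = 1.0  # momentum bookkeeping for the purely convex case (q = 0)
    for _ in range(outer_iters):
        # Regularized auxiliary problem F(x) = f(x) + kappa/2 * ||x - y||^2,
        # solved approximately by the non-accelerated inner method.
        F = lambda x, y=y, k=kappa: f(x) + 0.5 * k * np.dot(x - y, x - y)
        gF = lambda x, y=y, k=kappa: grad_f(x) + k * (x - y)
        x, n_inner = inner_solver(F, gF, y, tol=inner_tol)
        # Standard Catalyst extrapolation (Nesterov momentum with q = 0).
        alpha_next = 0.5 * (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2)
        beta = alpha * (1.0 - alpha) / (alpha**2 + alpha_next)
        y = x + beta * (x - x_prev)
        x_prev, alpha = x, alpha_next
        # Hypothetical adaptation: enlarge kappa when the subproblem was
        # hard for the inner solver, shrink it when it was easy.
        kappa = 2.0 * kappa if n_inner > target_inner else 0.5 * kappa
    return x_prev


# Example: minimize a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.diag([1.0, 0.1, 0.01])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b
x_star = adaptive_catalyst(f, grad_f, np.zeros(3))
```

A larger κ makes each auxiliary problem better conditioned for the inner solver but slows the outer rate, which is why selecting κ without knowing the smoothness constant is the practical bottleneck the paper addresses.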
Year: 2021
DOI: 10.1007/978-3-030-91059-4_2
Venue: OPTIMA
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name                 Order  Citations  PageRank
Ivanova Anastasiya   1      0          0.34
Grishchenko Dmitry   2      0          0.34
Gasnikov Alexander   3      27         11.58
Egor Shulgin         4      1          2.03