Title
Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization.
Abstract
We consider convex optimization problems with the objective function having Lipschitz-continuous $p$-th order derivative, where $p\geq 1$. We propose a new tensor method, which closes the gap between the lower $\Omega\left(\varepsilon^{-\frac{2}{3p+1}} \right)$ and upper $O\left(\varepsilon^{-\frac{1}{p+1}} \right)$ iteration complexity bounds for this class of optimization problems. We also consider uniformly convex functions and show how the proposed method can be accelerated under this additional assumption. Moreover, we introduce a $p$-th order condition number, which naturally arises in the complexity analysis of tensor methods under this assumption. Finally, we present a numerical study of the proposed optimal method and show that in practice it is faster than the best known accelerated tensor method. We also compare the performance of tensor methods for $p=2$ and $p=3$ and show that the 3rd-order method is superior to the 2nd-order method in practice.
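For reference, a standard formulation of this smoothness class (with $L_p$ denoting the conventional Lipschitz constant, a symbol not used in the abstract itself) is
$$\|\nabla^p f(x) - \nabla^p f(y)\| \leq L_p \|x - y\| \quad \text{for all } x, y,$$
and closing the stated gap means the proposed method reaches an $\varepsilon$-accurate solution in $\tilde{O}\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ iterations, matching the lower bound up to logarithmic factors.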
Year
2019
Venue
COLT
Field
Applied mathematics, Mathematical optimization, Tensor, Computer science, Regular polygon
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
6