| Abstract |
| --- |
| In this paper, we propose a framework based on sum-of-squares programming to design iterative first-order optimization algorithms for smooth and strongly convex problems. Our starting point is to develop a polynomial matrix inequality as a sufficient condition for exponential convergence of the algorithm. The entries of this matrix are polynomial functions of the unknown parameters (exponential decay rate, stepsize, momentum coefficient, etc.). We then formulate a polynomial optimization problem in which the exponential decay rate is optimized over the parameters of the algorithm. Finally, we use sum-of-squares programming as a tractable relaxation of the proposed polynomial optimization problem. We illustrate the utility of the proposed framework by designing a first-order algorithm that shares the same structure as Nesterov's accelerated gradient method. |
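The abstract's core idea — optimizing an exponential decay rate over algorithm parameters — can be illustrated with a toy sketch. This is not the paper's sum-of-squares method: for gradient descent on an L-smooth, m-strongly convex quadratic, the worst-case per-step contraction factor has the closed form max(|1 − αm|, |1 − αL|), so the parameter search can be done by brute-force grid search instead of an SOS relaxation. All names and the example values of m and L below are illustrative assumptions.

```python
# Toy illustration (not the paper's SOS framework): tune the stepsize of
# gradient descent x_{k+1} = x_k - a * f'(x_k) to minimize the worst-case
# per-step contraction factor rho(a) = max(|1 - a*m|, |1 - a*L|) over an
# L-smooth, m-strongly convex quadratic. The paper optimizes such a decay
# rate over algorithm parameters via sum-of-squares programming; here the
# one-dimensional analog is solved by grid search.

def contraction(a, m, L):
    """Worst-case per-step contraction factor of gradient descent."""
    return max(abs(1 - a * m), abs(1 - a * L))

def best_stepsize(m, L, grid=10000):
    """Grid-search the stepsize in (0, 2/L] minimizing the contraction factor."""
    best_a, best_rho = None, float("inf")
    for i in range(1, grid + 1):
        a = (2.0 / L) * i / grid
        rho = contraction(a, m, L)
        if rho < best_rho:
            best_a, best_rho = a, rho
    return best_a, best_rho

m, L = 1.0, 10.0
a_star, rho_star = best_stepsize(m, L)
# Classical theory predicts a* = 2/(m+L) and rho* = (L-m)/(L+m).
print(a_star, rho_star)
```

For m = 1 and L = 10 the grid search lands near the classical optimum a* = 2/11 with contraction factor 9/11; the paper's framework plays the same game in higher dimension, over several parameters at once (stepsize, momentum coefficient, decay rate), where the closed form is unavailable and SOS programming supplies the tractable relaxation.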
| Year | DOI | Venue |
| --- | --- | --- |
| 2018 | 10.1109/CDC.2018.8618984 | 2018 IEEE Conference on Decision and Control (CDC) |
| Field | DocType | ISSN |
| --- | --- | --- |
| Gradient method, Mathematical optimization, Polynomial matrix, Polynomial, Matrix (mathematics), Exponential decay, Convex function, Momentum, Optimization algorithm, Mathematics | Conference | 0743-1546 |
| Citations | PageRank | References |
| --- | --- | --- |
| 0 | 0.34 | 7 |
| Authors |
| --- |
| 3 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Mahyar Fazlyab | 1 | 33 | 6.06 |
| Manfred Morari | 2 | 6006 | 918.33 |
| Victor M. Preciado | 3 | 205 | 29.44 |