Title |
---|
Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number. |
Abstract |
---|
In this paper, we propose a new SVRG-style accelerated stochastic algorithm for solving a family of non-convex optimization problems whose objective consists of a sum of $n$ smooth functions and a non-smooth convex function. Our major goal is to improve the convergence of SVRG-style stochastic algorithms to stationary points under a setting with a large condition number $c$, the ratio between the smoothness constant and the negative curvature constant. The proposed algorithm achieves the best known gradient complexity when $c \geq \Omega(n)$, which was previously achieved only by a SAGA-style accelerated stochastic algorithm. Compared with the SAGA-style accelerated stochastic algorithm, the proposed algorithm is more practical due to its low memory cost, inherited from previous SVRG-style algorithms. Compared with previous studies on SVRG-style stochastic algorithms, our theory provides much stronger results in terms of (i) reduced gradient complexity under a large condition number; and (ii) convergence proved for a sampled stagewise averaged solution, selected from all stagewise averaged solutions with increasing sampling probabilities, instead of for a uniformly sampled solution across all iterations. |
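The abstract refers to the SVRG family of algorithms for composite objectives of the form $\min_x \frac{1}{n}\sum_{i=1}^n f_i(x) + g(x)$. As background only, here is a minimal sketch of a generic proximal SVRG loop (not the paper's Katalyst algorithm); the function names `grad_i` and `prox` and all step-size choices are assumptions for illustration:

```python
import numpy as np

def prox_svrg(grad_i, prox, x0, n, step, n_stages, m, seed=0):
    """Minimal prox-SVRG sketch for min_x (1/n) sum_i f_i(x) + g(x).

    grad_i(i, x): gradient of the i-th smooth component at x.
    prox(z, s):   proximal operator of g with step size s.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_stages):
        # Snapshot point and full gradient, recomputed once per stage.
        x_ref = x.copy()
        full_grad = np.mean([grad_i(i, x_ref) for i in range(n)], axis=0)
        for _ in range(m):
            i = int(rng.integers(n))
            # Variance-reduced stochastic gradient estimate.
            v = grad_i(i, x) - grad_i(i, x_ref) + full_grad
            x = prox(x - step * v, step)
        # Katalyst (per the abstract) additionally tracks stagewise averaged
        # solutions and samples one with increasing probabilities; that logic
        # is omitted here.
    return x
```

A toy usage: with $f_i(x) = \frac{1}{2}(x - a_i)^2$ and $g = 0$ (so `prox` is the identity), the minimizer is the mean of the $a_i$.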
Year | Venue | Field |
---|---|---|
2019 | International Conference on Machine Learning | Convergence (routing),Mathematical optimization,Condition number,Regular polygon,Stationary point,Convex function,Boosting (machine learning),Smoothness,Optimization problem,Mathematics |
DocType | Citations | PageRank |
Conference | 0 | 0.34 |
References | Authors |
---|---|
14 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zaiyi Chen | 1 | 35 | 4.77 |
Yi Xu | 2 | 30 | 9.92 |
Haoyuan Hu | 3 | 5 | 1.77 |
Tianbao Yang | 4 | 911 | 70.35 |