Title | ||
---|---|---|
Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD |
Abstract | ||
---|---|---|
We study Stochastic Gradient Descent (SGD) with diminishing step sizes for convex objective functions. We introduce a definitional framework and theory that defines and characterizes a core property of convex objective functions, called curvature. In terms of curvature we derive a new inequality that can be used to compute an optimal sequence of diminishing step sizes by solving a differential equation. Our exact solutions confirm known results in the literature and allow us to fully characterize a new regularizer with its corresponding expected convergence rates. |
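The abstract's central object, SGD with a diminishing step-size sequence, can be illustrated with a minimal sketch. This is a generic textbook example with the classic schedule η_t = 1/(t+1) on a simple strongly convex stochastic objective, not the optimal schedule the paper derives from its curvature-based differential equation:

```python
import numpy as np

# Minimal sketch (an assumption, not the paper's algorithm): SGD with
# diminishing step sizes eta_t = 1/(t+1) on the strongly convex stochastic
# objective f(w; xi) = 0.5 * ||w + xi||^2 with xi ~ N(0, I).
# The expected objective E[f(w; xi)] is minimized at w* = 0, and this
# schedule yields the well-known O(1/t) rate for E||w_t - w*||^2.
rng = np.random.default_rng(0)
d = 3
w = np.ones(d)                     # arbitrary starting point
T = 10_000

for t in range(T):
    xi = rng.normal(size=d)        # one stochastic sample
    grad = w + xi                  # unbiased estimate of the full gradient w
    w = w - grad / (t + 1)         # diminishing step size eta_t = 1/(t+1)

err = float(np.linalg.norm(w))     # distance to the minimizer w* = 0; small after T steps
```

For this particular objective the recursion telescopes so that w_T is minus the sample mean of the noise, giving error of order 1/sqrt(T); the paper's contribution is choosing the step-size sequence optimally for a general convex objective characterized by its curvature.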
Year | Venue | Field |
---|---|---|
2018 | International Conference on Machine Learning | Convergence (routing),Differential equation,Mathematical optimization,Stochastic gradient descent,Curvature,Regular polygon,Mathematics |
DocType | Volume | ISSN
---|---|---|
Journal | abs/1810.04100 | Proceedings of the 36th International Conference on Machine Learning, PMLR 97, 2019 |
Citations | PageRank | References |
0 | 0.34 | 0 |
Authors | ||
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Marten Van Dijk | 1 | 2875 | 242.07 |
Lam M. Nguyen | 2 | 43 | 8.95 |
Phuong Ha Nguyen | 3 | 84 | 12.41 |
Dzung T. Phan | 4 | 61 | 10.32 |