Abstract
---
We consider the minimization of a function defined on a Riemannian manifold M that is accessible only through unbiased estimates of its gradients. We develop a geometric framework to transform a sequence of slowly converging iterates generated from stochastic gradient descent (SGD) on M into an averaged iterate sequence with a robust and fast O(1/n) convergence rate. We then present an application of our framework to geodesically-strongly-convex (and possibly Euclidean non-convex) problems. Finally, we show how these ideas apply to the case of streaming k-PCA, where we show how to accelerate the slow rate of the randomized power method (without requiring knowledge of the eigengap) into a robust algorithm achieving the optimal rate of convergence.
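The streaming PCA setting described in the abstract can be illustrated, for the special case k = 1, by Oja's rule: a stochastic power-method step followed by retraction to the unit sphere, with a running average of the iterates. This is a minimal sketch under our own assumptions; the function name `oja_top_eigvec`, the 1/sqrt(n) step-size schedule, and the naive sign-aligned Euclidean average are illustrative choices, not the paper's geometry-aware averaging scheme.

```python
import numpy as np

def oja_top_eigvec(samples, eta=0.5, seed=0):
    """Streaming estimate of the top eigenvector of the sample covariance.

    Oja's rule (a stochastic power method) produces slowly converging
    iterates on the unit sphere; a sign-aligned running average of those
    iterates is then renormalized. Illustrative sketch only.
    """
    d = samples.shape[1]
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)                         # start on the unit sphere
    w_bar = w.copy()
    for n, x in enumerate(samples, start=1):
        w = w + (eta / np.sqrt(n)) * x * (x @ w)   # slow stochastic gradient step
        w /= np.linalg.norm(w)                     # retract back to the sphere
        if w @ w_bar < 0:                          # resolve the +/- sign ambiguity
            w = -w
        w_bar = ((n - 1) * w_bar + w) / n          # running average of iterates
    return w_bar / np.linalg.norm(w_bar)
```

On data with a dominant direction (e.g. one coordinate with inflated variance), the averaged iterate aligns closely with the top eigenvector even though each individual SGD iterate is noisy, which is the intuition behind averaging the slow sequence.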
Year | Venue | DocType
---|---|---
2018 | COLT | Conference

Volume | Citations | PageRank
---|---|---
abs/1802.09128 | 2 | 0.37

References | Authors
---|---
15 | 4
Name | Order | Citations | PageRank
---|---|---|---
Nilesh Tripuraneni | 1 | 18 | 4.22
Nicolas Flammarion | 2 | 135 | 9.77
Francis Bach | 3 | 11490 | 622.29
Michael I. Jordan | 4 | 31220 | 3640.80