Title
Averaging Stochastic Gradient Descent on Riemannian Manifolds
Abstract
We consider the minimization of a function defined on a Riemannian manifold M, accessible only through unbiased estimates of its gradients. We develop a geometric framework that transforms a sequence of slowly converging iterates generated by stochastic gradient descent (SGD) on M into an averaged iterate sequence with a robust and fast O(1/n) convergence rate. We then present an application of our framework to geodesically-strongly-convex (and possibly Euclidean non-convex) problems. Finally, we show how these ideas apply to the case of streaming k-PCA, where we accelerate the slow rate of the randomized power method (without requiring knowledge of the eigengap) into a robust algorithm achieving the optimal rate of convergence.
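As a concrete illustration of the recipe described in the abstract (slow Riemannian SGD iterates followed by on-manifold averaging), below is a minimal Python sketch for the simplest streaming-PCA instance: k = 1 on the unit sphere. It is a toy under stated assumptions, not the paper's algorithm; the function name streaming_pca_averaged and the step-size constant eta are invented for the example, and renormalization stands in both for the retraction and for mapping the averaged iterate back onto the manifold.

import numpy as np

def streaming_pca_averaged(samples, eta=0.1):
    """Toy Riemannian SGD for streaming 1-PCA with iterate averaging.

    samples: iterable of d-dimensional vectors x_t with E[x x^T] = Sigma.
    eta: step-size constant; eta / sqrt(t) is a standard slow, robust schedule.
    Returns the averaged iterate, renormalized back onto the sphere.
    """
    it = iter(samples)
    w = np.asarray(next(it), dtype=float).copy()
    w /= np.linalg.norm(w)            # initial point on the unit sphere
    w_bar = np.zeros_like(w)          # running average of the iterates
    for t, x in enumerate(it, start=1):
        g = (x @ w) * x               # stochastic gradient of w -> (w.x)^2 / 2
        g_tan = g - (g @ w) * w       # project onto the tangent space at w
        w = w + (eta / np.sqrt(t)) * g_tan   # ascent step (maximize w^T Sigma w)
        w /= np.linalg.norm(w)        # retract the iterate back onto the sphere
        w_bar += (w - w_bar) / t      # online mean of the slow SGD iterates
    return w_bar / np.linalg.norm(w_bar)  # map the average back to the sphere

# Usage on synthetic data with a planted top eigenvector e_1 (eigengap 9):
rng = np.random.default_rng(0)
d = 20
v = np.zeros(d)
v[0] = 1.0
cov = np.eye(d) + 9.0 * np.outer(v, v)
data = rng.multivariate_normal(np.zeros(d), cov, size=20000)
w_hat = streaming_pca_averaged(data)
print(abs(w_hat @ v))                 # close to 1 when the estimate is good

Averaging in the ambient space and renormalizing is the simplest stand-in for a genuinely geometric average; when the iterates concentrate around a point of the sphere, the two notions agree to first order, which is the regime where averaging buys the faster rate.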
Year
2018
Venue
COLT
DocType
Conference
Volume
abs/1802.09128
Citations
2
PageRank
0.37
References
15
Authors
4
Name                 Order   Citations   PageRank
Nilesh Tripuraneni   1       18          4.22
Nicolas Flammarion   2       135         9.77
Francis Bach         3       11490       622.29
Michael I. Jordan    4       31220       3640.80