Title
Convergence of a Grassmannian Gradient Descent Algorithm for Subspace Estimation From Undersampled Data.
Abstract
Subspace learning and matrix factorization problems have a great many applications in science and engineering, and efficient algorithms are critical as dataset sizes continue to grow. Many relevant problem formulations are non-convex, and in a variety of contexts it has been observed that solving the non-convex problem directly is not only efficient but reliably accurate. We discuss convergence theory for a particular method: first-order incremental gradient descent constrained to the Grassmannian. The output of the algorithm is an orthonormal basis for a $d$-dimensional subspace spanned by an input streaming data matrix. We study two sampling cases: one in which each data vector of the streaming matrix is fully sampled, and one in which it is undersampled by a sampling matrix $A_t \in \mathbb{R}^{m \times n}$ with $m \ll n$. We propose an adaptive stepsize scheme that depends only on the sampled data and the algorithm's outputs. We prove that, with fully sampled data, the stepsize scheme maximizes the improvement of our convergence metric at each iteration, and that the method converges from any random initialization to the true subspace, despite the non-convex formulation and orthogonality constraints. For the case of undersampled data, we establish monotonic improvement in the defined convergence metric at each iteration with high probability.
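For illustration, the following is a minimal Python sketch of one incremental gradient step constrained to the Grassmannian, in the style of rank-one streaming subspace updates such as GROUSE. The function name grouse_step, the parameter eta, and the simple stepsize rule theta = eta * ||r|| * ||p|| are assumptions made for this sketch; the paper's adaptive stepsize scheme, which depends only on the sampled data and algorithm outputs, is not reproduced here.

import numpy as np

def grouse_step(U, v, idx=None, eta=1.0):
    """One incremental gradient step on the Grassmannian (GROUSE-style sketch).

    U   : (n, d) orthonormal basis for the current subspace estimate
    v   : (n,) streaming data vector; only v[idx] is observed when idx is given
    idx : indices of observed entries (None = fully sampled case)
    eta : illustrative stepsize knob; the paper derives an adaptive stepsize instead
    """
    if idx is None:
        w = U.T @ v                      # least-squares weights (U is orthonormal)
        r = v - U @ w                    # residual, orthogonal to span(U)
    else:
        # Undersampled case: fit the weights on the observed entries only.
        w, *_ = np.linalg.lstsq(U[idx], v[idx], rcond=None)
        r = np.zeros_like(v)
        r[idx] = v[idx] - U[idx] @ w
    p = U @ w                            # projection of v onto span(U)
    wn, rn, pn = np.linalg.norm(w), np.linalg.norm(r), np.linalg.norm(p)
    if rn < 1e-12 or pn < 1e-12:
        return U                         # v carries no usable update direction
    theta = eta * rn * pn                # simple rule for this sketch only
    # Rank-one geodesic update: U stays exactly orthonormal after every step.
    step = (np.cos(theta) - 1.0) * p / pn + np.sin(theta) * r / rn
    return U + np.outer(step, w / wn)

The geodesic form of the update is what enforces the orthogonality constraint: the iterate moves along the Grassmannian rather than being projected back onto it, so no re-orthogonalization pass is needed.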
Year
2016
Venue
arXiv: Numerical Analysis
Field
Convergence (mathematics), Discrete mathematics, Gradient descent, Mathematical optimization, Subspace topology, Matrix (mathematics), Matrix decomposition, Orthonormal basis, Adaptive stepsize, Grassmannian, Mathematics
DocType
Journal
Volume
abs/1610.00199
Citations
3
PageRank
0.39
References
13
Authors
2
Name, Order, Citations, PageRank
Dejiao Zhang, 1, 19, 2.77
Laura Balzano, 2, 410, 27.51