Abstract |
---|
We establish theoretical recovery guarantees for a family of Riemannian optimization algorithms for low-rank matrix recovery, the problem of reconstructing an m × n rank-r matrix from p < mn linear measurements. The algorithms are first interpreted as iterative hard thresholding algorithms with subspace projections. Based on this connection, we show that, provided the restricted isometry constant R_{3r} of the sensing operator is less than C_k/√r, the Riemannian gradient descent algorithm and a restarted variant of the Riemannian conjugate gradient algorithm are guaranteed to converge linearly to the underlying rank-r matrix when initialized by one-step hard thresholding. Empirical evaluation shows that the algorithms recover a low-rank matrix from nearly the minimum number of measurements necessary. |
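The iterative-hard-thresholding view described in the abstract can be illustrated with a minimal sketch. This is plain IHT with a rank-r projection via truncated SVD, not the paper's Riemannian method; the function names, problem sizes, fixed step size, and Gaussian sensing model below are illustrative assumptions, and the one-step hard-thresholding initialization mirrors the abstract's description.

```python
import numpy as np

def hard_threshold(X, r):
    # Best rank-r approximation via truncated SVD (Eckart-Young).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def iht_recover(A, b, shape, r, eta=0.5, steps=500):
    # A: (p, m*n) sensing matrix acting on vec(X); b = A @ vec(X_true).
    m, n = shape
    # One-step hard-thresholding initialization.
    X = hard_threshold((A.T @ b).reshape(m, n), r)
    for _ in range(steps):
        grad = (A.T @ (A @ X.ravel() - b)).reshape(m, n)
        X = hard_threshold(X - eta * grad, r)  # gradient step + rank projection
    return X

# Illustrative instance: p < mn = 64 Gaussian measurements of a rank-1 matrix.
rng = np.random.default_rng(0)
m, n, r, p = 8, 8, 1, 56
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A = rng.standard_normal((p, m * n)) / np.sqrt(p)  # near-isometry on low-rank matrices
b = A @ X_true.ravel()
X_hat = iht_recover(A, b, (m, n), r)
rel_err = np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true)
```

The restricted isometry condition in the abstract plays the role of guaranteeing that the gradient step above is a near-isometric update on low-rank matrices, which is what drives the linear convergence.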
Year | DOI | Venue
---|---|---
2016 | 10.1137/15M1050525 | SIAM Journal on Matrix Analysis and Applications
Keywords | Field | DocType
---|---|---
matrix recovery, low rank matrix manifold, Riemannian optimization, gradient descent and conjugate gradient descent methods, restricted isometry constant | Conjugate gradient method, Gradient descent, Combinatorics, Subspace topology, Matrix (mathematics), Mathematical analysis, Isometry, Low-rank approximation, Operator (computer programming), Thresholding, Mathematics | Journal
Volume | Issue | ISSN
---|---|---
37 | 3 | 0895-4798
Citations | PageRank | References
---|---|---
19 | 0.68 | 28
Authors |
---|
4 |
Name | Order | Citations | PageRank
---|---|---|---
Ke Wei | 1 | 131 | 7.79 |
Jian-Feng Cai | 2 | 2828 | 125.44 |
Tony F. Chan | 3 | 8733 | 659.77 |
Shingyu Leung | 4 | 164 | 18.35 |