Title
Finding a low-rank basis in a matrix subspace
Abstract
For a given matrix subspace, how can we find a basis that consists of low-rank matrices? This is a generalization of the sparse vector problem. It turns out that when the subspace is spanned by rank-1 matrices, the basis matrices can be obtained by a tensor CP decomposition. For the higher-rank case, the situation is not as straightforward. In this work we present an algorithm based on a greedy process that is applicable to higher-rank problems. Our algorithm first estimates the minimum rank by applying soft singular value thresholding to a nuclear norm relaxation, and then computes a matrix with that rank using the method of alternating projections. We provide local convergence results, and compare our algorithm with several alternative approaches. Applications include data compression beyond the classical truncated SVD, computing accurate eigenvectors of a near-multiple eigenvalue, image separation, and graph Laplacian eigenproblems.
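As a rough illustration of the two-phase idea described in the abstract (a minimal sketch, not the authors' implementation), the Python code below alternates soft singular value thresholding with projection back onto the subspace to estimate an achievable rank, and then runs alternating projections between the fixed-rank matrices and the subspace. It assumes the subspace is given by a basis that is orthonormal with respect to the Frobenius inner product; the function names, the threshold tau, and the iteration counts are hypothetical choices.

```python
import numpy as np

def soft_svt(X, tau):
    """Soft singular value thresholding: shrink all singular values by tau."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def project_subspace(X, basis):
    """Orthogonal projection of X onto span(basis); the basis is assumed
    orthonormal w.r.t. the Frobenius inner product <A, B> = trace(A^T B)."""
    return sum(np.sum(B * X) * B for B in basis)

def truncate_rank(X, r):
    """Projection onto the set of matrices of rank at most r (truncated SVD)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def low_rank_element(basis, tau=0.1, n_est=50, n_alt=200):
    """Return one (estimated) low-rank element of the subspace and its rank.
    Phase 1: soft thresholding + projection to guess the attainable rank.
    Phase 2: alternating projections between rank-r matrices and the subspace."""
    X = basis[0] / np.linalg.norm(basis[0])
    for _ in range(n_est):                     # rank-estimation phase
        X = project_subspace(soft_svt(X, tau), basis)
        X /= np.linalg.norm(X)                 # keep the iterate normalized
    r = max(1, int(np.sum(np.linalg.svd(X, compute_uv=False) > tau)))
    for _ in range(n_alt):                     # alternating-projections phase
        X = project_subspace(truncate_rank(X, r), basis)
        X /= np.linalg.norm(X)
    return X, r
```

A full basis would be built greedily: once one low-rank element is found, it is deflated from the subspace and the procedure is repeated on the remaining directions. That outer loop, the handling of the threshold, and the convergence guarantees are the parts of the paper's algorithm that this sketch omits.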
Year
2015
DOI
10.1007/s10107-016-1042-2
Venue
Mathematical Programming: Series A and B
Keywords
Low-rank matrix subspace, $\ell^1$ relaxation, Alternating projections, Singular value thresholding, Matrix compression, 90C26 Nonconvex programming, global optimization
Field
Rank (linear algebra), Discrete mathematics, Singular value decomposition, Laplacian matrix, Mathematical optimization, Subspace topology, Matrix (mathematics), Matrix norm, Low-rank approximation, Eigenvalues and eigenvectors, Mathematics
DocType
Journal
Volume
162
Issue
1-2
ISSN
0025-5610
Citations
1
PageRank
0.37
References
36
Authors
3
Name                Order  Citations  PageRank
Yuji Nakatsukasa    1      971        7.74
Tasuku Soma         2      2          1.07
André Uschmajew     3      135        9.34