Abstract |
---|
We compute a sparse solution to the classical least-squares problem min_x ‖Ax − b‖_2, where A is an arbitrary matrix. We describe a novel algorithm for this sparse least-squares problem. The algorithm operates as follows: it first selects columns of A, and then solves a least-squares problem using only the selected columns. The column selection algorithm that we use is known to perform well for the well-studied column subset selection problem. The contribution of this article is to show that it gives favorable results for sparse least-squares as well. Specifically, we prove that the solution vector obtained by our algorithm is close to the solution vector obtained via what is known as the "SVD-truncated regularization approach". |
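The two-step procedure described in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the authors' algorithm: the column-selection step here is a simple placeholder heuristic (largest column norms), whereas the paper uses a column subset selection algorithm with provable guarantees. The SVD-truncated baseline it is compared against is the standard rank-k pseudoinverse solution.

```python
import numpy as np

def sparse_least_squares(A, b, r):
    """Sketch of the two-step approach: select r columns of A, then solve
    least-squares restricted to those columns. Column selection here is a
    placeholder heuristic (largest 2-norms), not the paper's method."""
    cols = np.argsort(np.linalg.norm(A, axis=0))[-r:]
    # Least-squares fit using only the selected columns.
    x_sub, *_ = np.linalg.lstsq(A[:, cols], b, rcond=None)
    # Embed back into a length-n vector, zero outside the selected columns.
    x = np.zeros(A.shape[1])
    x[cols] = x_sub
    return x

def svd_truncated_solution(A, b, k):
    """SVD-truncated regularization baseline: x_k = A_k^+ b,
    where A_k is the best rank-k approximation of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
b = rng.standard_normal(20)

x = sparse_least_squares(A, b, r=3)   # sparse: at most 3 nonzeros
x_k = svd_truncated_solution(A, b, k=3)
```

The paper's result bounds the distance between a vector of the first kind (sparse, from selected columns) and a vector of the second kind (dense, from the truncated SVD); the sketch above only illustrates the two objects being compared.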
Year | DOI | Venue
---|---|---|
2014 | 10.1016/j.ipl.2013.11.011 | Information Processing Letters

Keywords | DocType | Volume
---|---|---|
sparse least-squares problem, classical least-squares problem, sparse solution, sparse least-squares, least-squares problem, column selection algorithm, novel algorithm, selected columns, sparse least-squares regression, solution vector, column subset selection problem, sparse approximation, regression, least squares, algorithms, regularization | Journal | 114

Issue | ISSN | Citations
---|---|---|
5 | 0020-0190 | 2

PageRank | References | Authors
---|---|---|
0.42 | 8 | 2

Name | Order | Citations | PageRank |
---|---|---|---|
Christos Boutsidis | 1 | 610 | 33.37 |
Malik Magdon-Ismail | 2 | 914 | 104.34 |