Abstract |
---|
The standard linear regression (SLR) problem is to recover a vector x0 from noisy linear observations y = Ax0 + w. The approximate message passing (AMP) algorithm recently proposed by Donoho, Maleki, and Montanari is a computationally efficient iterative approach to SLR with a remarkable property: for large i.i.d. sub-Gaussian matrices A, its per-iteration behavior is rigorously characterized by a scalar state evolution whose fixed points, when unique, are Bayes optimal. AMP, however, is fragile in that even small deviations from the i.i.d. sub-Gaussian model can cause the algorithm to diverge. This paper considers a "vector AMP" (VAMP) algorithm and shows that VAMP has a rigorous scalar state evolution that holds under a much broader class of large random matrices A: those that are right-rotationally invariant. After an initial singular value decomposition (SVD) of A, the per-iteration complexity of VAMP is similar to that of AMP. In addition, the fixed points of VAMP's state evolution are consistent with the replica prediction of the minimum mean-squared error recently derived by Tulino, Caire, Verdu, and Shamai. |
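To make the setup concrete, here is a minimal sketch of the SLR model y = Ax0 + w and a basic AMP iteration with a soft-thresholding denoiser. This illustrates the classical AMP recursion for i.i.d. Gaussian A (not the paper's VAMP algorithm); the dimensions, noise level, threshold factor, and iteration count are all illustrative choices, not values from the paper.

```python
import numpy as np

# Illustrative problem sizes (not from the paper): signal length n,
# measurements m, number of nonzeros k.
rng = np.random.default_rng(0)
n, m, k = 500, 250, 25
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # i.i.d. Gaussian, ~unit-norm columns
y = A @ x0 + 0.01 * rng.standard_normal(m)     # noisy linear observations

def soft(u, t):
    """Soft-thresholding denoiser eta(u; t)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x = np.zeros(n)
z = y.copy()
for _ in range(30):
    tau = np.sqrt(np.mean(z ** 2))        # effective noise level; the scalar
                                          # tracked by the state evolution
    x_new = soft(x + A.T @ z, 1.5 * tau)  # denoise the pseudo-data
    # Onsager correction: (1/delta) * mean(eta') * z, with delta = m/n and
    # eta'(u) = 1 on the support of the thresholded estimate.
    z = y - A @ x_new + (np.count_nonzero(x_new) / m) * z
    x = x_new
```

The Onsager term added to the residual z is what distinguishes AMP from plain iterative thresholding and is what makes the per-iteration error behave like Gaussian noise of variance tau^2, the quantity the scalar state evolution tracks.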
Year | DOI | Venue
---|---|---
2017 | 10.1109/TIT.2019.2916359 | International Symposium on Information Theory
Keywords | DocType | Volume
---|---|---
Approximation algorithms, Covariance matrices, Message passing, Signal processing algorithms, Minimization, Standards, Linear regression | Conference | abs/1610.03082
Issue | ISSN | Citations
---|---|---
10 | 0018-9448 | 40
PageRank | References | Authors
---|---|---
1.26 | 19 | 3
Name | Order | Citations | PageRank
---|---|---|---
Sundeep Rangan | 1 | 3101 | 163.90 |
Philip Schniter | 2 | 1620 | 93.74 |
Alyson K. Fletcher | 3 | 552 | 41.10 |