Abstract
The generalized linear model (GLM), where a random vector x is observed through a noisy, possibly nonlinear, function of a linear transform output z = Ax, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When A is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general A, though, GAMP can misbehave. Damping and sequential updating help to robustify GAMP, but their effects are limited. Recently, a "vector AMP" (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian A to the larger class of rotationally invariant A. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in A than damped GAMP.
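The GLM setup the abstract describes (a random vector x observed through a noisy, possibly nonlinear function of z = Ax) can be made concrete with a short simulation. The sketch below is illustrative only and not code from the paper: it generates 1-bit quantized compressed-sensing measurements, one of the GLM instances listed above. The dimensions, sparsity level, and noise standard deviation are assumptions chosen for the example.

```python
# Minimal sketch of a GLM observation model: y is a noisy, nonlinear
# function of the linear transform output z = A x. Here the nonlinearity
# is 1-bit quantization (quantized compressed sensing).
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 512, 256, 32            # signal length, measurements, nonzeros (assumed)

# Sparse signal x with k Gaussian nonzero entries on a random support.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# i.i.d. Gaussian measurement matrix A and linear transform output z = A x.
A = rng.standard_normal((m, n)) / np.sqrt(m)
z = A @ x

# GLM observation channel: add Gaussian noise, then quantize to one bit.
sigma = 0.1                        # assumed noise standard deviation
y = np.sign(z + sigma * rng.standard_normal(m))
```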
| Year | DOI | Venue |
|---|---|---|
| 2016 | 10.1109/ACSSC.2016.7869633 | 2016 50th Asilomar Conference on Signals, Systems and Computers |
| DocType | Volume | ISSN |
|---|---|---|
| Conference | abs/1612.01186 | 1058-6393 |
| Citations | PageRank | References |
|---|---|---|
| 0 | 0.34 | 0 |
| Authors |
|---|
| 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Philip Schniter | 1 | 1620 | 93.74 |
| Sundeep Rangan | 2 | 3101 | 163.90 |
| Alyson K. Fletcher | 3 | 552 | 41.10 |