### Abstract

The generalized linear model (GLM), where a random vector x is observed through a noisy, possibly nonlinear, function of a linear transform output z = Ax, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When A is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general A, though, GAMP can misbehave. Damping and sequential-updating help to robustify GAMP, but their effects are limited. Recently, a 'vector AMP' (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian A to the larger class of rotationally invariant A. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in A than damped GAMP.
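As a concrete illustration of the measurement model the abstract describes, the sketch below (not code from the paper) generates observations from one example GLM output channel — 1-bit quantization, as in quantized compressed sensing — with an i.i.d. Gaussian transform A; the dimensions and noise level are arbitrary choices for the example.

```python
import numpy as np

# Illustrative GLM measurement model: y = Q(z + w) with z = A x,
# where Q is a 1-bit quantizer (one of the output channels the
# abstract mentions). Dimensions and noise level are arbitrary.
rng = np.random.default_rng(0)
n, m = 256, 512                                # signal / measurement dims

x = rng.standard_normal(n)                     # unknown random vector x
A = rng.standard_normal((m, n)) / np.sqrt(n)   # i.i.d. Gaussian transform A
z = A @ x                                      # linear transform output z = Ax
w = 0.1 * rng.standard_normal(m)               # pre-quantization noise
y = np.sign(z + w)                             # nonlinear (1-bit) observation
```

The inference task GAMP and GLM-VAMP address is recovering x from y, A, and knowledge of the priors and output channel; only the forward model is shown here.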

| Original language | English (US) |
|---|---|
| Title of host publication | Conference Record of the 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016 |
| Publisher | IEEE Computer Society |
| Pages | 1525-1529 |
| Number of pages | 5 |
| ISBN (Electronic) | 9781538639542 |
| DOIs | https://doi.org/10.1109/ACSSC.2016.7869633 |
| State | Published - Mar 1 2017 |
| Event | 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016 - Pacific Grove, United States. Duration: Nov 6 2016 → Nov 9 2016 |

### Other

| Other | 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016 |
|---|---|
| Country | United States |
| City | Pacific Grove |
| Period | 11/6/16 → 11/9/16 |

### ASJC Scopus subject areas

- Signal Processing
- Computer Networks and Communications

### Cite this

Schniter, P., Rangan, S., & Fletcher, A. K. (2017). **Vector approximate message passing for the generalized linear model.** In *Conference Record of the 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016* (pp. 1525-1529). [7869633] IEEE Computer Society. https://doi.org/10.1109/ACSSC.2016.7869633

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
