Vector approximate message passing for the generalized linear model

Philip Schniter, Sundeep Rangan, Alyson K. Fletcher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The generalized linear model (GLM), where a random vector x is observed through a noisy, possibly nonlinear, function of a linear transform output z = Ax, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When A is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general A, though, GAMP can misbehave. Damping and sequential updating help to robustify GAMP, but their effects are limited. Recently, a 'vector AMP' (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian A to the larger class of rotationally invariant A. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in A than damped GAMP.
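To make the observation model concrete, the following is a minimal sketch of one GLM instance from the abstract, quantized compressed sensing: a sparse signal x is passed through a linear transform z = Ax and then observed via a noisy 1-bit quantizer. All dimensions, the sparsity rate, and the noise level are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions: N-dimensional signal observed via M linear measurements (illustrative).
N, M = 512, 256

# Sparse signal: Bernoulli-Gaussian with sparsity rate 0.1 (an assumed prior).
x = rng.standard_normal(N) * (rng.random(N) < 0.1)

# Large i.i.d. Gaussian measurement matrix A, the regime where GAMP's
# scalar state evolution holds.
A = rng.standard_normal((M, N)) / np.sqrt(M)

# Linear transform output z = Ax, as in the GLM description.
z = A @ x

# Nonlinear, noisy output channel: additive Gaussian noise followed by a
# 1-bit (sign) quantizer -- the quantized-compressed-sensing instance of the GLM.
y = np.sign(z + 0.1 * rng.standard_normal(M))
```

Other applications in the abstract correspond to swapping the output channel: a logistic or probit channel gives binary classification, |z| plus noise gives phase retrieval, and a Poisson channel gives photon-limited imaging.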

Original language: English (US)
Title of host publication: Conference Record of the 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016
Publisher: IEEE Computer Society
Pages: 1525-1529
Number of pages: 5
ISBN (Electronic): 9781538639542
DOI: 10.1109/ACSSC.2016.7869633
State: Published - Mar 1 2017
Event: 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016 - Pacific Grove, United States
Duration: Nov 6 2016 - Nov 9 2016



ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications

Cite this

Schniter, P., Rangan, S., & Fletcher, A. K. (2017). Vector approximate message passing for the generalized linear model. In Conference Record of the 50th Asilomar Conference on Signals, Systems and Computers, ACSSC 2016 (pp. 1525-1529). [7869633] IEEE Computer Society. https://doi.org/10.1109/ACSSC.2016.7869633
