Vector approximate message passing

Sundeep Rangan, Philip Schniter, Alyson K. Fletcher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The standard linear regression (SLR) problem is to recover a vector x0 from noisy linear observations y = Ax0 + w. The approximate message passing (AMP) algorithm recently proposed by Donoho, Maleki, and Montanari is a computationally efficient iterative approach to SLR that has a remarkable property: for large i.i.d. sub-Gaussian matrices A, its per-iteration behavior is rigorously characterized by a scalar state-evolution whose fixed points, when unique, are Bayes optimal. AMP, however, is fragile in that even small deviations from the i.i.d. sub-Gaussian model can cause the algorithm to diverge. This paper considers a 'vector AMP' (VAMP) algorithm and shows that VAMP has a rigorous scalar state-evolution that holds under a much broader class of large random matrices A: those that are right-rotationally invariant. After performing an initial singular value decomposition (SVD) of A, the per-iteration complexity of VAMP is similar to that of AMP. In addition, the fixed points of VAMP's state evolution are consistent with the replica prediction of the minimum mean-squared error recently derived by Tulino, Caire, Verdú, and Shamai.
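The abstract's setup can be illustrated with a small sketch: generating a right-rotationally-invariant measurement matrix A = U S Vᵀ (one whose right singular-vector matrix V is Haar-distributed) and drawing SLR observations y = Ax0 + w. The dimensions, singular-value profile, and noise level below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 300, 400  # hypothetical problem dimensions

def haar_orthogonal(k, rng):
    # QR of an i.i.d. Gaussian matrix; fixing the signs of R's diagonal
    # makes Q Haar-distributed on the orthogonal group.
    Q, R = np.linalg.qr(rng.standard_normal((k, k)))
    return Q * np.sign(np.diag(R))

# Right-rotationally invariant A = U S V^T: the right singular-vector
# matrix V is Haar-distributed, while U and the singular values may be
# arbitrary (here drawn uniformly for illustration).
U = haar_orthogonal(m, rng)
V = haar_orthogonal(n, rng)
s = rng.uniform(0.5, 2.0, size=m)        # assumed singular-value profile
A = U @ (np.diag(s) @ V.T[:m, :])        # m x n measurement matrix

# SLR observation model y = A x0 + w
x0 = rng.standard_normal(n)
w = 0.1 * rng.standard_normal(m)         # assumed noise level
y = A @ x0 + w
```

In contrast, AMP's guarantees require A to have i.i.d. sub-Gaussian entries; the construction above, with a non-Gaussian singular-value profile, falls outside that class but inside the right-rotationally-invariant class covered by VAMP's state evolution.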

Original language: English (US)
Title of host publication: 2017 IEEE International Symposium on Information Theory, ISIT 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1588-1592
Number of pages: 5
ISBN (Electronic): 9781509040964
DOI: 10.1109/ISIT.2017.8006797
State: Published - Aug 9, 2017
Event: 2017 IEEE International Symposium on Information Theory, ISIT 2017 - Aachen, Germany
Duration: Jun 25, 2017 - Jun 30, 2017

Keywords

  • Belief propagation
  • Compressive sensing
  • Inference algorithms
  • Message passing
  • Random matrices

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics

Cite this

Rangan, S., Schniter, P., & Fletcher, A. K. (2017). Vector approximate message passing. In 2017 IEEE International Symposium on Information Theory, ISIT 2017 (pp. 1588-1592). [8006797] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISIT.2017.8006797
