### Abstract

The standard linear regression (SLR) problem is to recover a vector x^0 from noisy linear observations y = Ax^0 + w. The approximate message passing (AMP) algorithm recently proposed by Donoho, Maleki, and Montanari is a computationally efficient iterative approach to SLR with a remarkable property: for large i.i.d. sub-Gaussian matrices A, its per-iteration behavior is rigorously characterized by a scalar state evolution whose fixed points, when unique, are Bayes optimal. AMP, however, is fragile in that even small deviations from the i.i.d. sub-Gaussian model can cause the algorithm to diverge. This paper considers a "vector AMP" (VAMP) algorithm and shows that VAMP has a rigorous scalar state evolution that holds under a much broader class of large random matrices A: those that are right-rotationally invariant. After an initial singular value decomposition (SVD) of A, the per-iteration complexity of VAMP is similar to that of AMP. In addition, the fixed points of VAMP's state evolution are consistent with the replica prediction of the minimum mean-squared error recently derived by Tulino, Caire, Verdú, and Shamai.
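As a concrete illustration (not taken from the paper itself), the SLR observation model and the role of the one-time SVD can be sketched in NumPy. The dimensions, sparsity level, noise scale, regularization weight `lam`, and the helper `lmmse_svd` are all illustrative assumptions; the full VAMP algorithm additionally alternates this linear step with a nonlinear denoising step and scalar variance updates.

```python
import numpy as np

# Sketch of the SLR setup y = A x^0 + w, and of why a one-time SVD of A
# makes VAMP's per-iteration cost comparable to AMP's: the regularized
# linear solve at each iteration reduces to elementwise work in the
# singular-value domain. All constants here are arbitrary choices.
rng = np.random.default_rng(0)
m, n = 512, 1024                                   # measurements, signal length
x0 = rng.normal(size=n) * (rng.random(n) < 0.1)    # sparse ground truth
A = rng.normal(size=(m, n)) / np.sqrt(m)           # i.i.d. Gaussian sensing matrix
w = 0.01 * rng.normal(size=m)                      # additive noise
y = A @ x0 + w

# One-time "economy" SVD: A = U diag(s) Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

def lmmse_svd(y, r, lam):
    """Solve argmin_x ||y - A x||^2 + lam ||x - r||^2 via the cached SVD.

    Equivalent to (A^T A + lam I)^{-1} (A^T y + lam r), but each call
    costs only matrix-vector products once the SVD is in hand.
    """
    z = U.T @ y - s * (Vt @ r)           # residual in the SVD domain
    return r + Vt.T @ (s / (s**2 + lam) * z)

x_hat = lmmse_svd(y, np.zeros(n), lam=0.05)
```

In the actual VAMP iteration, `r` and `lam` would be the extrinsic mean and precision passed in from the denoising stage rather than fixed values as here.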

Original language | English (US)
---|---
Title of host publication | 2017 IEEE International Symposium on Information Theory, ISIT 2017
Publisher | Institute of Electrical and Electronics Engineers Inc.
Pages | 1588-1592
Number of pages | 5
ISBN (Electronic) | 9781509040964
DOIs | https://doi.org/10.1109/ISIT.2017.8006797
State | Published - Aug 9 2017
Event | 2017 IEEE International Symposium on Information Theory, ISIT 2017 - Aachen, Germany. Duration: Jun 25 2017 → Jun 30 2017

### Other

Other | 2017 IEEE International Symposium on Information Theory, ISIT 2017
---|---
Country | Germany
City | Aachen
Period | 6/25/17 → 6/30/17

### Keywords

- Belief propagation
- Compressive sensing
- Inference algorithms
- Message passing
- Random matrices

### ASJC Scopus subject areas

- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics

### Cite this

Rangan, S., Schniter, P., & Fletcher, A. K. (2017). Vector approximate message passing. In *2017 IEEE International Symposium on Information Theory, ISIT 2017* (pp. 1588-1592). [8006797] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISIT.2017.8006797

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
