Rigorous dynamics and consistent estimation in arbitrarily conditioned linear systems

Alyson K. Fletcher, Mojtaba Sahraee-Ardakan, Sundeep Rangan, Philip Schniter

Research output: Contribution to journal › Conference article

Abstract

We consider the problem of estimating a random vector x from noisy linear measurements y = Ax + w in the setting where parameters θ of the distributions of x and w must be learned in addition to the vector x itself. This problem arises in a wide range of statistical learning and linear inverse problems. Our main contribution shows that a computationally simple iterative message-passing algorithm provably obtains asymptotically consistent estimates in a certain high-dimensional large-system limit (LSL) under very general parametrizations. Importantly, this LSL applies to all right-rotationally invariant A, a much larger class of matrices than the i.i.d. sub-Gaussian matrices to which many past message-passing approaches are restricted. In addition, a simple testable condition is provided under which the mean-squared error (MSE) on the vector x matches the Bayes-optimal MSE predicted by the replica method. The proposed algorithm combines Expectation-Maximization (EM) with the recently developed Vector Approximate Message Passing (VAMP) technique. We develop an analysis framework showing that the parameter estimates at each iteration of the algorithm converge to deterministic limits that can be precisely predicted by a simple set of state-evolution (SE) equations. The SE equations, which extend those of VAMP without parameter adaptation, depend only on the initial parameter estimates and the statistical properties of the problem, and can be used to predict consistency and to precisely characterize other performance measures of the method.
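As a rough illustration of the setup the abstract describes, the sketch below generates measurements y = Ax + w with a right-rotationally invariant, ill-conditioned A and runs plain EM on an unknown noise variance under a Gaussian prior. Everything here (the dimensions, the singular-value profile, the closed-form E-step) is an illustrative simplification, not the paper's code: EM-VAMP replaces the exact E-step with VAMP so that general priors can be handled at message-passing cost.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 100  # signal and measurement dimensions

# Right-rotationally invariant A = U diag(s) V^T: V is Haar-distributed
# (QR of a Gaussian matrix) and the singular values s may be arbitrarily
# ill-conditioned -- the matrix class covered by the paper's LSL analysis.
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -1.5, m)                  # condition number ~32
A = U @ np.diag(s) @ V[:, :m].T

tau, sigma2_true = 1.0, 0.01                 # the "theta": prior and noise variances
x = np.sqrt(tau) * rng.standard_normal(n)    # Gaussian prior, so the E-step is closed-form
y = A @ x + np.sqrt(sigma2_true) * rng.standard_normal(m)

# EM on the unknown noise variance, starting from a poor initialization.
# E-step: exact Gaussian posterior of x given the current sigma2 estimate.
# M-step: sigma2 <- E[||y - Ax||^2 | y] / m.
sigma2 = 1.0
for _ in range(100):
    cov = np.linalg.inv(A.T @ A / sigma2 + np.eye(n) / tau)
    xhat = cov @ A.T @ y / sigma2            # posterior mean of x
    resid = y - A @ xhat
    sigma2 = (resid @ resid + np.trace(A @ cov @ A.T)) / m

print(f"true sigma^2 = {sigma2_true}, EM estimate = {sigma2:.4f}")
```

In the paper's setting the exact E-step above is intractable for general priors; EM-VAMP substitutes VAMP, and the SE equations then track both the MSE of the estimate of x and the trajectory of the θ estimates across iterations.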

Original language: English (US)
Pages (from-to): 2546-2555
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
State: Published - Jan 1 2017
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: Dec 4 2017 - Dec 9 2017


ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Rigorous dynamics and consistent estimation in arbitrarily conditioned linear systems. / Fletcher, Alyson K.; Sahraee-Ardakan, Mojtaba; Rangan, Sundeep; Schniter, Philip.

In: Advances in Neural Information Processing Systems, Vol. 2017-December, 01.01.2017, p. 2546-2555.

Research output: Contribution to journal › Conference article

Fletcher, Alyson K.; Sahraee-Ardakan, Mojtaba; Rangan, Sundeep; Schniter, Philip. / Rigorous dynamics and consistent estimation in arbitrarily conditioned linear systems. In: Advances in Neural Information Processing Systems. 2017; Vol. 2017-December. pp. 2546-2555.
@article{9b5c3e714ee24fa98858e38d7fe93cc2,
title = "Rigorous dynamics and consistent estimation in arbitrarily conditioned linear systems",
abstract = "We consider the problem of estimating a random vector x from noisy linear measurements y = Ax + w in the setting where parameters θ of the distributions of x and w must be learned in addition to the vector x itself. This problem arises in a wide range of statistical learning and linear inverse problems. Our main contribution shows that a computationally simple iterative message-passing algorithm provably obtains asymptotically consistent estimates in a certain high-dimensional large-system limit (LSL) under very general parametrizations. Importantly, this LSL applies to all right-rotationally invariant A, a much larger class of matrices than the i.i.d. sub-Gaussian matrices to which many past message-passing approaches are restricted. In addition, a simple testable condition is provided under which the mean-squared error (MSE) on the vector x matches the Bayes-optimal MSE predicted by the replica method. The proposed algorithm combines Expectation-Maximization (EM) with the recently developed Vector Approximate Message Passing (VAMP) technique. We develop an analysis framework showing that the parameter estimates at each iteration of the algorithm converge to deterministic limits that can be precisely predicted by a simple set of state-evolution (SE) equations. The SE equations, which extend those of VAMP without parameter adaptation, depend only on the initial parameter estimates and the statistical properties of the problem, and can be used to predict consistency and to precisely characterize other performance measures of the method.",
author = "Fletcher, {Alyson K.} and Mojtaba Sahraee-Ardakan and Sundeep Rangan and Philip Schniter",
year = "2017",
month = "1",
day = "1",
language = "English (US)",
volume = "2017-December",
pages = "2546--2555",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",
}

TY - JOUR

T1 - Rigorous dynamics and consistent estimation in arbitrarily conditioned linear systems

AU - Fletcher, Alyson K.

AU - Sahraee-Ardakan, Mojtaba

AU - Rangan, Sundeep

AU - Schniter, Philip

PY - 2017/1/1

Y1 - 2017/1/1

N2 - We consider the problem of estimating a random vector x from noisy linear measurements y = Ax + w in the setting where parameters θ of the distributions of x and w must be learned in addition to the vector x itself. This problem arises in a wide range of statistical learning and linear inverse problems. Our main contribution shows that a computationally simple iterative message-passing algorithm provably obtains asymptotically consistent estimates in a certain high-dimensional large-system limit (LSL) under very general parametrizations. Importantly, this LSL applies to all right-rotationally invariant A, a much larger class of matrices than the i.i.d. sub-Gaussian matrices to which many past message-passing approaches are restricted. In addition, a simple testable condition is provided under which the mean-squared error (MSE) on the vector x matches the Bayes-optimal MSE predicted by the replica method. The proposed algorithm combines Expectation-Maximization (EM) with the recently developed Vector Approximate Message Passing (VAMP) technique. We develop an analysis framework showing that the parameter estimates at each iteration of the algorithm converge to deterministic limits that can be precisely predicted by a simple set of state-evolution (SE) equations. The SE equations, which extend those of VAMP without parameter adaptation, depend only on the initial parameter estimates and the statistical properties of the problem, and can be used to predict consistency and to precisely characterize other performance measures of the method.

AB - We consider the problem of estimating a random vector x from noisy linear measurements y = Ax + w in the setting where parameters θ of the distributions of x and w must be learned in addition to the vector x itself. This problem arises in a wide range of statistical learning and linear inverse problems. Our main contribution shows that a computationally simple iterative message-passing algorithm provably obtains asymptotically consistent estimates in a certain high-dimensional large-system limit (LSL) under very general parametrizations. Importantly, this LSL applies to all right-rotationally invariant A, a much larger class of matrices than the i.i.d. sub-Gaussian matrices to which many past message-passing approaches are restricted. In addition, a simple testable condition is provided under which the mean-squared error (MSE) on the vector x matches the Bayes-optimal MSE predicted by the replica method. The proposed algorithm combines Expectation-Maximization (EM) with the recently developed Vector Approximate Message Passing (VAMP) technique. We develop an analysis framework showing that the parameter estimates at each iteration of the algorithm converge to deterministic limits that can be precisely predicted by a simple set of state-evolution (SE) equations. The SE equations, which extend those of VAMP without parameter adaptation, depend only on the initial parameter estimates and the statistical properties of the problem, and can be used to predict consistency and to precisely characterize other performance measures of the method.

UR - http://www.scopus.com/inward/record.url?scp=85047020210&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85047020210&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85047020210

VL - 2017-December

SP - 2546

EP - 2555

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -