Approximate message passing with consistent parameter estimation and applications to sparse learning

Ulugbek S. Kamilov, Sundeep Rangan, Alyson K. Fletcher, Michael Unser

Research output: Contribution to journal › Article

Abstract

We consider the estimation of an independent and identically distributed (i.i.d.) (possibly non-Gaussian) vector $\mathbf{x} \in \mathbb{R}^n$ from measurements $\mathbf{y} \in \mathbb{R}^m$ obtained by a general cascade model consisting of a known linear transform followed by a probabilistic, componentwise (possibly nonlinear) measurement channel. A novel method, called adaptive generalized approximate message passing (adaptive GAMP), is presented. It enables joint learning of the statistics of the prior and measurement channel, along with estimation of the unknown vector $\mathbf{x}$. We prove that, for large i.i.d. Gaussian transform matrices, the asymptotic componentwise behavior of adaptive GAMP is predicted by a simple set of scalar state evolution equations. In addition, we show that adaptive GAMP yields asymptotically consistent parameter estimates when a certain maximum-likelihood estimation can be performed in each step. This implies that the algorithm achieves a reconstruction quality equivalent to that of an oracle algorithm that knows the correct parameter values. Remarkably, this result applies to essentially arbitrary parametrizations of the unknown distributions, including nonlinear and non-Gaussian ones. The adaptive GAMP methodology thus provides a systematic, general, and computationally efficient method applicable to a large range of linear-nonlinear models with provable guarantees.
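
For a concrete picture of the setting, the following is a minimal Python sketch of the AWGN special case of the cascade model above, recovered with a basic (non-adaptive) AMP iteration using a soft-threshold denoiser. It is illustrative only: the function names and the fixed threshold parameter lam are our own choices, and the paper's adaptive GAMP additionally learns the prior and channel parameters via maximum-likelihood steps at each iteration, which this sketch omits.

    import numpy as np

    def soft_threshold(v, lam):
        """Componentwise soft-thresholding denoiser for a sparse prior."""
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def amp_sparse(A, y, n_iter=30, lam=2.0):
        """Basic AMP for y = A x + w, the AWGN special case of the cascade
        model. Adaptive GAMP would re-estimate the prior and channel
        parameters at every iteration; here the threshold scale is fixed."""
        m, n = A.shape
        x = np.zeros(n)
        z = y.copy()
        for _ in range(n_iter):
            tau = np.linalg.norm(z) / np.sqrt(m)   # effective noise level from the residual
            r = x + A.T @ z                        # pseudo-data (matched-filter step)
            x_new = soft_threshold(r, lam * tau)   # componentwise denoising
            # Onsager correction: keeps the effective noise in r approximately
            # Gaussian across iterations.
            z = y - A @ x_new + (z / m) * np.count_nonzero(x_new)
            x = x_new
        return x

    # Illustrative usage on synthetic data (dimensions and noise level arbitrary).
    rng = np.random.default_rng(0)
    n, m, k = 1000, 400, 50
    A = rng.standard_normal((m, n)) / np.sqrt(m)   # large i.i.d. Gaussian transform
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true + 0.01 * rng.standard_normal(m)
    x_hat = amp_sparse(A, y)

The Onsager correction term in the residual update is what makes the componentwise behavior of the iterates track the scalar state evolution equations mentioned in the abstract.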

Original language: English (US)
Article number: 6775335
Pages (from-to): 2969-2985
Number of pages: 17
Journal: IEEE Transactions on Information Theory
Volume: 60
Issue number: 5
DOI: 10.1109/TIT.2014.2309005
State: Published - 2014

Keywords

  • Approximate message passing
  • belief propagation
  • compressive sensing
  • parameter estimation

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Cite this

Kamilov, U. S., Rangan, S., Fletcher, A. K., & Unser, M. (2014). Approximate message passing with consistent parameter estimation and applications to sparse learning. IEEE Transactions on Information Theory, 60(5), 2969-2985. Article 6775335. https://doi.org/10.1109/TIT.2014.2309005

@article{f3e8c717c3374c458b971bd1b12c3183,
  title     = "Approximate message passing with consistent parameter estimation and applications to sparse learning",
  author    = "Kamilov, {Ulugbek S.} and Sundeep Rangan and Fletcher, {Alyson K.} and Michael Unser",
  journal   = "IEEE Transactions on Information Theory",
  year      = "2014",
  volume    = "60",
  number    = "5",
  pages     = "2969--2985",
  doi       = "10.1109/TIT.2014.2309005",
  issn      = "0018-9448",
  publisher = "Institute of Electrical and Electronics Engineers Inc.",
  language  = "English (US)",
}
