Iterative estimation of constrained rank-one matrices in noise

Sundeep Rangan, Alyson K. Fletcher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We consider the problem of estimating a rank-one matrix in Gaussian noise under a probabilistic model for the left and right factors of the matrix. The probabilistic model can impose constraints on the factors, such as sparsity and positivity, that arise commonly in learning problems. We propose a simple iterative procedure that reduces the problem to a sequence of scalar estimation computations. The method is similar to approximate message passing techniques based on Gaussian approximations of loopy belief propagation that have been used recently in compressed sensing. Leveraging analysis methods by Bayati and Montanari, we show that the asymptotic behavior of the estimates from the proposed iterative procedure is described by a simple scalar equivalent model, in which the distribution of the estimates is identical to that of certain scalar estimates of the variables in Gaussian noise. Moreover, the effective Gaussian noise level is described by a set of state evolution equations. The proposed procedure thus provides a computationally simple and general method for rank-one estimation problems, with a precise analysis in certain high-dimensional settings.
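
To make the kind of procedure described above concrete, the sketch below alternates matched-filter correlations with entrywise scalar denoising to recover sparse, nonnegative rank-one factors from a noisy data matrix. It is a minimal illustration under assumed scalings, not the paper's algorithm: soft-thresholding stands in for the prior-dependent scalar estimators, the Onsager-style correction terms and the state evolution analysis are omitted, and all names and parameters (soft_threshold, rank_one_factor_iteration, tau) are hypothetical.

import numpy as np


def soft_threshold(x, tau):
    # Entrywise scalar denoiser standing in for a sparsity-promoting prior.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)


def rank_one_factor_iteration(Y, n_iter=50, tau=0.1):
    # Estimate the left and right factors of Y ~ u v^T + W (Gaussian noise W)
    # by alternating matched-filter steps with entrywise scalar denoising.
    _, _, Vt = np.linalg.svd(Y, full_matrices=False)
    v = Vt[0]  # initialize from the leading right singular vector
    for _ in range(n_iter):
        # Left-factor step: correlate the data with the current right factor,
        # rescale, then solve an entrywise (scalar) denoising problem.
        u = soft_threshold(Y @ v / max(v @ v, 1e-12), tau)
        # Right-factor step, symmetric to the left-factor step.
        v = soft_threshold(Y.T @ u / max(u @ u, 1e-12), tau)
    return u, v


# Toy usage: sparse, nonnegative factors observed through Gaussian noise.
rng = np.random.default_rng(0)
m, n = 200, 300
u_true = np.maximum(rng.normal(size=m), 0.0) * (rng.random(m) < 0.2)
v_true = np.maximum(rng.normal(size=n), 0.0) * (rng.random(n) < 0.2)
Y = np.outer(u_true, v_true) + 0.1 * rng.normal(size=(m, n))
u_hat, v_hat = rank_one_factor_iteration(Y)

The entrywise denoiser is the only place a factor prior enters this sketch; replacing soft-thresholding with, say, clipping to the nonnegative orthant or a posterior-mean estimator changes which constraints are imposed without changing the outer iteration.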

Original language: English (US)
Title of host publication: 2012 IEEE International Symposium on Information Theory Proceedings, ISIT 2012
Pages: 1246-1250
Number of pages: 5
DOIs: 10.1109/ISIT.2012.6283056
State: Published - 2012
Event: 2012 IEEE International Symposium on Information Theory, ISIT 2012 - Cambridge, MA, United States
Duration: Jul 1, 2012 - Jul 6, 2012

Other

Other: 2012 IEEE International Symposium on Information Theory, ISIT 2012
Country: United States
City: Cambridge, MA
Period: 7/1/12 - 7/6/12

ASJC Scopus subject areas

  • Applied Mathematics
  • Modeling and Simulation
  • Theoretical Computer Science
  • Information Systems

Cite this

Rangan, S., & Fletcher, A. K. (2012). Iterative estimation of constrained rank-one matrices in noise. In 2012 IEEE International Symposium on Information Theory Proceedings, ISIT 2012 (pp. 1246-1250). [6283056] https://doi.org/10.1109/ISIT.2012.6283056

@inproceedings{186ec6d9a574449bbb97b4d35eed3450,
title = "Iterative estimation of constrained rank-one matrices in noise",
author = "Sundeep Rangan and Fletcher, {Alyson K.}",
year = "2012",
doi = "10.1109/ISIT.2012.6283056",
language = "English (US)",
isbn = "9781467325790",
pages = "1246--1250",
booktitle = "2012 IEEE International Symposium on Information Theory Proceedings, ISIT 2012",

}
