### Abstract

We consider the problem of estimating a rank-one matrix in Gaussian noise under a probabilistic model for the left and right factors of the matrix. The probabilistic model can impose constraints on the factors, including sparsity and positivity, that arise commonly in learning problems. We propose a family of algorithms that reduce the problem to a sequence of scalar estimation computations. These algorithms are similar to approximate message passing techniques based on Gaussian approximations of loopy belief propagation that have been used recently in compressed sensing. Leveraging analysis methods by Bayati and Montanari, we show that the asymptotic behavior of the algorithm is described by a simple scalar equivalent model, where the distribution of the estimates at each iteration is identical to certain scalar estimates of the variables in Gaussian noise. Moreover, the effective Gaussian noise level is described by a set of state evolution equations. The proposed approach to deriving algorithms thus provides a computationally simple and general method for rank-one estimation problems with a precise analysis in certain high-dimensional settings.

Original language | Undefined
---|---
Article number | 1202.2759
Journal | arXiv
State | Published - Feb 13 2012
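The abstract describes reducing rank-one matrix estimation to a sequence of scalar denoising steps, with priors such as sparsity on the left factor and positivity on the right. The following is a heavily simplified illustrative sketch of that idea: an alternating power-iteration-style loop where each matched-filter output is passed through a scalar denoiser matching the factor's prior. It omits the Onsager correction term and the state evolution tracking that distinguish the paper's actual AMP algorithm; all dimensions, the SNR value, and the threshold are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, snr = 400, 300, 3.0

# Ground truth: a sparse left factor u and a positive right factor v
# (illustrative instances of the priors mentioned in the abstract).
u = rng.normal(size=m) * (rng.random(m) < 0.2)
v = np.abs(rng.normal(size=n))
u /= np.linalg.norm(u)
v /= np.linalg.norm(v)

# Observation: rank-one signal plus i.i.d. Gaussian noise.
Y = snr * np.outer(u, v) + rng.normal(size=(m, n)) / np.sqrt(n)

def soft_threshold(x, t):
    """Scalar denoiser encoding a sparsity prior."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Simplified iteration: alternate matched filtering with scalar denoising.
# (The paper's algorithm additionally carries an Onsager correction and
# tracks the effective noise level via state evolution; omitted here.)
v_hat = rng.normal(size=n)
v_hat /= np.linalg.norm(v_hat)
for _ in range(50):
    u_hat = soft_threshold(Y @ v_hat, 0.05)   # left factor: sparsity prior
    u_hat /= np.linalg.norm(u_hat) + 1e-12
    v_hat = np.maximum(Y.T @ u_hat, 0.0)      # right factor: positivity prior
    v_hat /= np.linalg.norm(v_hat) + 1e-12

# Overlap of the estimates with the true factors.
print(abs(u @ u_hat), abs(v @ v_hat))
```

At this SNR the signal sits above the spectral detection threshold, so even this stripped-down loop recovers factors well correlated with the truth; the paper's contribution is the precise scalar-equivalent characterization of such iterations, not this particular heuristic.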

### Keywords

- cs.IT
- math.IT

### Cite this

**Iterative reconstruction of rank-one matrices in noise.** / Fletcher, Alyson K.; Rangan, Sundeep.

In: *arXiv*, [1202.2759].

Research output: Contribution to journal › Article

TY - JOUR

T1 - Iterative reconstruction of rank-one matrices in noise

AU - Fletcher, Alyson K.

AU - Rangan, Sundeep

N1 - 28 pages, 2 figures

PY - 2012/2/13

Y1 - 2012/2/13

N2 - We consider the problem of estimating a rank-one matrix in Gaussian noise under a probabilistic model for the left and right factors of the matrix. The probabilistic model can impose constraints on the factors including sparsity and positivity that arise commonly in learning problems. We propose a family of algorithms that reduce the problem to a sequence of scalar estimation computations. These algorithms are similar to approximate message passing techniques based on Gaussian approximations of loopy belief propagation that have been used recently in compressed sensing. Leveraging analysis methods by Bayati and Montanari, we show that the asymptotic behavior of the algorithm is described by a simple scalar equivalent model, where the distribution of the estimates at each iteration is identical to certain scalar estimates of the variables in Gaussian noise. Moreover, the effective Gaussian noise level is described by a set of state evolution equations. The proposed approach to deriving algorithms thus provides a computationally simple and general method for rank-one estimation problems with a precise analysis in certain high-dimensional settings.

KW - cs.IT

KW - math.IT

M3 - Article

JO - arXiv

JF - arXiv

M1 - 1202.2759

ER -