Necessary and sufficient conditions for sparsity pattern recovery

Alyson K. Fletcher, Sundeep Rangan, Vivek K. Goyal

Research output: Contribution to journal › Article

Abstract

The paper considers the problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements. A new necessary condition on the number of measurements for asymptotically reliable detection with maximum likelihood (ML) estimation and Gaussian measurement matrices is derived. This necessary condition for ML detection is compared against a sufficient condition for simple maximum correlation (MC) or thresholding algorithms. The analysis shows that the gap between thresholding and ML can be described by a simple expression in terms of the total signal-to-noise ratio (SNR), with the gap growing with increasing SNR. Thresholding is also compared against the more sophisticated Lasso and orthogonal matching pursuit (OMP) methods. At high SNRs, it is shown that the gap between Lasso and OMP on the one hand and thresholding on the other is described by the range of powers of the nonzero component values of the unknown signals. Specifically, the key benefit of Lasso and OMP over thresholding is their ability to detect signals with relatively small components.
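The maximum-correlation (thresholding) detector the abstract compares against ML estimation can be sketched in a few lines: correlate the noisy measurements with each column of the measurement matrix and keep the k columns with the largest normalized correlation. The sketch below is a toy illustration, not the paper's analysis; the problem sizes, noise level, and unit nonzero values are illustrative assumptions chosen so that recovery succeeds.

```python
import numpy as np

def threshold_support(A, y, k):
    # Maximum-correlation (thresholding) detector: return the indices of the
    # k columns of A whose normalized absolute correlation with y is largest.
    corr = np.abs(A.T @ y) / np.linalg.norm(A, axis=0)
    return set(np.argsort(corr)[-k:].tolist())

rng = np.random.default_rng(0)
n, m, k = 256, 1000, 5                       # illustrative dimensions
support = set(rng.choice(n, size=k, replace=False).tolist())
x = np.zeros(n)
x[list(support)] = 1.0                       # unit nonzero components (toy choice)
A = rng.standard_normal((m, n)) / np.sqrt(m) # Gaussian measurement matrix
y = A @ x + 0.01 * rng.standard_normal(m)    # noisy measurements at high SNR
est = threshold_support(A, y, k)
```

At this (generous) number of measurements and SNR the estimated support matches the true one; the paper's point is precisely how m must scale, and how thresholding falls behind ML, Lasso, and OMP when the nonzero components have widely differing powers.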

Original language: English (US)
Pages (from-to): 5758-5772
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Volume: 55
Issue number: 12
DOI: 10.1109/TIT.2009.2032726
State: Published - Dec 2009

Keywords

  • Compressed sensing
  • Convex optimization
  • Lasso
  • Maximum-likelihood (ML) estimation
  • Orthogonal matching pursuit (OMP)
  • Random matrices
  • Random projections
  • Sparse approximation
  • Subset selection
  • Thresholding

ASJC Scopus subject areas

  • Computer Science Applications
  • Information Systems
  • Library and Information Sciences

Cite this

Necessary and sufficient conditions for sparsity pattern recovery. / Fletcher, Alyson K.; Rangan, Sundeep; Goyal, Vivek K.

In: IEEE Transactions on Information Theory, Vol. 55, No. 12, 12.2009, p. 5758-5772.

Research output: Contribution to journal › Article

Fletcher, Alyson K. ; Rangan, Sundeep ; Goyal, Vivek K. / Necessary and sufficient conditions for sparsity pattern recovery. In: IEEE Transactions on Information Theory. 2009 ; Vol. 55, No. 12. pp. 5758-5772.
@article{e7bc1adeb54046b1bb85cb511a8e9d6f,
title = "Necessary and sufficient conditions for sparsity pattern recovery",
abstract = "The paper considers the problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements. A new necessary condition on the number of measurements for asymptotically reliable detection with maximum likelihood (ML) estimation and Gaussian measurement matrices is derived. This necessary condition for ML detection is compared against a sufficient condition for simple maximum correlation (MC) or thresholding algorithms. The analysis shows that the gap between thresholding and ML can be described by a simple expression in terms of the total signal-to-noise ratio (SNR), with the gap growing with increasing SNR. Thresholding is also compared against the more sophisticated Lasso and orthogonal matching pursuit (OMP) methods. At high SNRs, it is shown that the gap between Lasso and OMP on the one hand and thresholding on the other is described by the range of powers of the nonzero component values of the unknown signals. Specifically, the key benefit of Lasso and OMP over thresholding is their ability to detect signals with relatively small components.",
keywords = "Compressed sensing, Convex optimization, Lasso, Maximum-likelihood (ML) estimation, Orthogonal matching pursuit (OMP), Random matrices, Random projections, Sparse approximation, Subset selection, Thresholding",
author = "Fletcher, {Alyson K.} and Sundeep Rangan and Goyal, {Vivek K.}",
year = "2009",
month = "12",
doi = "10.1109/TIT.2009.2032726",
language = "English (US)",
volume = "55",
pages = "5758--5772",
journal = "IEEE Transactions on Information Theory",
issn = "0018-9448",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "12",
}

TY - JOUR

T1 - Necessary and sufficient conditions for sparsity pattern recovery

AU - Fletcher, Alyson K.

AU - Rangan, Sundeep

AU - Goyal, Vivek K.

PY - 2009/12

Y1 - 2009/12

N2 - The paper considers the problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements. A new necessary condition on the number of measurements for asymptotically reliable detection with maximum likelihood (ML) estimation and Gaussian measurement matrices is derived. This necessary condition for ML detection is compared against a sufficient condition for simple maximum correlation (MC) or thresholding algorithms. The analysis shows that the gap between thresholding and ML can be described by a simple expression in terms of the total signal-to-noise ratio (SNR), with the gap growing with increasing SNR. Thresholding is also compared against the more sophisticated Lasso and orthogonal matching pursuit (OMP) methods. At high SNRs, it is shown that the gap between Lasso and OMP on the one hand and thresholding on the other is described by the range of powers of the nonzero component values of the unknown signals. Specifically, the key benefit of Lasso and OMP over thresholding is their ability to detect signals with relatively small components.

AB - The paper considers the problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements. A new necessary condition on the number of measurements for asymptotically reliable detection with maximum likelihood (ML) estimation and Gaussian measurement matrices is derived. This necessary condition for ML detection is compared against a sufficient condition for simple maximum correlation (MC) or thresholding algorithms. The analysis shows that the gap between thresholding and ML can be described by a simple expression in terms of the total signal-to-noise ratio (SNR), with the gap growing with increasing SNR. Thresholding is also compared against the more sophisticated Lasso and orthogonal matching pursuit (OMP) methods. At high SNRs, it is shown that the gap between Lasso and OMP on the one hand and thresholding on the other is described by the range of powers of the nonzero component values of the unknown signals. Specifically, the key benefit of Lasso and OMP over thresholding is their ability to detect signals with relatively small components.

KW - Compressed sensing

KW - Convex optimization

KW - Lasso

KW - Maximum-likelihood (ML) estimation

KW - Orthogonal matching pursuit (OMP)

KW - Random matrices

KW - Random projections

KW - Sparse approximation

KW - Subset selection

KW - Thresholding

UR - http://www.scopus.com/inward/record.url?scp=70449505052&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=70449505052&partnerID=8YFLogxK

U2 - 10.1109/TIT.2009.2032726

DO - 10.1109/TIT.2009.2032726

M3 - Article

AN - SCOPUS:70449505052

VL - 55

SP - 5758

EP - 5772

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 12

ER -