Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

Research output: Contribution to journal › Article

Abstract

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Renyi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.
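
For context, the classical inequalities that the abstract generalizes can be stated as follows. This is a standard textbook formulation for a real random variable $X$ with density $f$, Shannon entropy $h(X)$, and Fisher information $I(X)$, not the paper's generalized versions:

```latex
% Moment-entropy inequality: among densities with second moment
% E[X^2] = \sigma^2, the Gaussian maximizes Shannon entropy.
h(X) \le \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^2\right),
\qquad \sigma^2 = \mathbb{E}[X^2].

% Stam's inequality: the entropy power N(X) and the Fisher
% information satisfy N(X) I(X) >= 1, with equality iff X is Gaussian.
N(X)\,I(X) \ge 1,
\qquad N(X) = \frac{1}{2\pi e}\,e^{2h(X)},
\qquad I(X) = \int \frac{f'(x)^2}{f(x)}\,dx.

% Exponentiating the moment-entropy bound gives N(X) <= \sigma^2;
% combined with Stam's inequality this yields the Cramér-Rao inequality:
\sigma^2\, I(X) \ge 1.
```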
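
As a quick numerical illustration of the classical moment-entropy inequality, the sketch below compares closed-form differential entropies of a Gaussian and a Laplace density with equal variance; the Laplace comparison is an illustrative choice, not taken from the paper:

```python
import math

# Moment-entropy inequality (classical case): among densities with a
# fixed second moment, the Gaussian maximizes Shannon (differential)
# entropy. We verify this for one non-Gaussian competitor.

def gaussian_entropy(var):
    """Differential entropy of N(0, var): (1/2) log(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def laplace_entropy(scale):
    """Differential entropy of Laplace(0, b): 1 + log(2b). Variance is 2b^2."""
    return 1 + math.log(2 * scale)

b = 1.0
var = 2 * b ** 2                    # match the Laplace variance exactly
h_gauss = gaussian_entropy(var)     # ~ 1.7655
h_laplace = laplace_entropy(b)      # ~ 1.6931
assert h_gauss > h_laplace          # Gaussian entropy dominates, as expected
```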

Original language: English (US)
Pages (from-to): 473-478
Number of pages: 6
Journal: IEEE Transactions on Information Theory
Volume: 51
Issue number: 2
DOIs: 10.1109/TIT.2004.840871
State: Published - Feb 2005

Keywords

  • Entropy
  • Fisher information
  • Information measure
  • Information theory
  • Moment
  • Renyi entropy

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Information Systems

Cite this

@article{4f72b5aca2cc429383fc6795cb9da894,
title = "Cram{\'e}r-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information",
abstract = "The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cram{\'e}r-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Renyi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cram{\'e}r-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.",
keywords = "Entropy, Fisher information, Information measure, Information theory, Moment, Renyi entropy",
author = "Erwin Lutwak and Deane Yang and Gaoyong Zhang",
year = "2005",
month = "2",
doi = "10.1109/TIT.2004.840871",
language = "English (US)",
volume = "51",
pages = "473--478",
journal = "IEEE Transactions on Information Theory",
issn = "0018-9448",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "2",

}

TY - JOUR

T1 - Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

AU - Lutwak, Erwin

AU - Yang, Deane

AU - Zhang, Gaoyong

PY - 2005/2

Y1 - 2005/2

N2 - The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Renyi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.

AB - The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Renyi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.

KW - Entropy

KW - Fisher information

KW - Information measure

KW - Information theory

KW - Moment

KW - Renyi entropy

UR - http://www.scopus.com/inward/record.url?scp=13444267746&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=13444267746&partnerID=8YFLogxK

U2 - 10.1109/TIT.2004.840871

DO - 10.1109/TIT.2004.840871

M3 - Article

VL - 51

SP - 473

EP - 478

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 2

ER -