Extensions of Fisher information and Stam's inequality

Research output: Contribution to journal › Article

Abstract

We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, contain Gaussians as a limiting case but are noteworthy because they are heavy-tailed.
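
For background, the classical Stam inequality that the abstract refers to can be stated in its standard textbook form (this statement is drawn from the classical literature, not reproduced from the paper): for a random vector $X$ on $\mathbb{R}^n$ with smooth density $f$ and Shannon entropy $h(X)$, the Fisher information $J(X)$ and the entropy power $N(X)$ satisfy

J(X) = \int_{\mathbb{R}^n} \frac{|\nabla f(x)|^2}{f(x)} \, dx,
\qquad
N(X) = \frac{1}{2 \pi e} \, e^{2 h(X)/n},
\qquad
N(X) \, J(X) \ge n,

with equality if and only if $X$ is Gaussian. The generalized Gaussians introduced in the paper play the corresponding extremal role in the extended inequalities.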

Original language: English (US)
Article number: 6157091
Pages (from-to): 1319-1327
Number of pages: 9
Journal: IEEE Transactions on Information Theory
Volume: 58
Issue number: 3
DOIs: 10.1109/TIT.2011.2177563
State: Published - Mar 2012

Keywords

  • Entropy
  • Fisher information
  • information measure
  • information theory
  • Rényi entropy
  • Shannon entropy
  • Shannon theory
  • Stam inequality

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Cite this

Extensions of Fisher information and Stam's inequality. / Lutwak, Erwin; Lv, Songjun; Yang, Deane; Zhang, Gaoyong.

In: IEEE Transactions on Information Theory, Vol. 58, No. 3, 6157091, 03.2012, p. 1319-1327.

Research output: Contribution to journal › Article

@article{6a15d82088cf42a28fe7a13e461ab856,
title = "Extensions of fisher information and stam's inequality",
abstract = "We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, contain Gaussians as a limiting case but are noteworthy because they are heavy-tailed.",
keywords = "Entropy, Fisher information, information measure, information theory, R{\'e}nyi entropy, Shannon entropy, Shannon theory, Stam inequality",
author = "Erwin Lutwak and Songjun Lv and Deane Yang and Gaoyong Zhang",
year = "2012",
month = "3",
doi = "10.1109/TIT.2011.2177563",
language = "English (US)",
volume = "58",
pages = "1319--1327",
journal = "IEEE Transactions on Information Theory",
issn = "0018-9448",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "3",

}

TY - JOUR

T1 - Extensions of Fisher information and Stam's inequality

AU - Lutwak, Erwin

AU - Lv, Songjun

AU - Yang, Deane

AU - Zhang, Gaoyong

PY - 2012/3

Y1 - 2012/3

N2 - We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, contain Gaussians as a limiting case but are noteworthy because they are heavy-tailed.

AB - We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, contain Gaussians as a limiting case but are noteworthy because they are heavy-tailed.

KW - Entropy

KW - Fisher information

KW - information measure

KW - information theory

KW - Rényi entropy

KW - Shannon entropy

KW - Shannon theory

KW - Stam inequality

UR - http://www.scopus.com/inward/record.url?scp=84863236970&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84863236970&partnerID=8YFLogxK

U2 - 10.1109/TIT.2011.2177563

DO - 10.1109/TIT.2011.2177563

M3 - Article

VL - 58

SP - 1319

EP - 1327

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 3

M1 - 6157091

ER -