A MAP Estimate that Maximizes Entropy—An Alternative Interpretation for an Autoregressive Model

Research output: Contribution to journal › Article

Abstract

It is shown here that when extrapolation of a sequence of data with unknown statistics is performed under two optimization constraints, viz. maximizing the entropy and maximizing the a posteriori (MAP) probability density function (PDF) of the unknown sample, the resulting estimate is the same as that of an Autoregressive (AR) model. This leads to the conclusion that the estimate from an AR model is optimum in the sense that it is the MAP estimate which maximizes entropy.
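A minimal sketch of the claim, assuming a zero-mean Gaussian AR(p) model with prediction coefficients a_k and innovation variance sigma^2 (this notation is illustrative and not taken from the paper): the Gaussian density maximizes entropy for a given covariance, and under it the a posteriori PDF of the unknown sample given the p most recent observations is itself Gaussian,

p(x_N \mid x_{N-1}, \ldots, x_{N-p}) \propto \exp\!\left( -\frac{\bigl( x_N - \sum_{k=1}^{p} a_k x_{N-k} \bigr)^2}{2\sigma^2} \right),

so the MAP estimate equals the conditional mean, which is the familiar AR one-step predictor,

\hat{x}_N = \arg\max_{x_N} \, p(x_N \mid x_{N-1}, \ldots, x_{N-p}) = \sum_{k=1}^{p} a_k x_{N-k}.

Under these assumptions, the extrapolated sample that simultaneously maximizes the entropy and the a posteriori PDF coincides with the AR model's prediction, which is the interpretation the paper formalizes.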

Original language: English (US)
Pages (from-to): 843-844
Number of pages: 2
Journal: Proceedings of the IEEE
Volume: 73
Issue number: 4
DOIs: 10.1109/PROC.1985.13208
State: Published - 1985

Fingerprint

  • Entropy
  • Extrapolation
  • Probability density function
  • Statistics

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

Cite this

A MAP Estimate that Maximizes Entropy—An Alternative Interpretation for an Autoregressive Model. / Pillai, Unnikrishna.

In: Proceedings of the IEEE, Vol. 73, No. 4, 1985, p. 843-844.

Research output: Contribution to journal › Article

@article{d1add240d37e449bbde584f937b4b2db,
title = "A MAP Estimate that Maximizes Entropy—An Alternative Interpretation for an Autoregressive Model",
abstract = "It is shown here that when extrapolation of a sequence of data with unknown statistics is performed under two optimization constraints, viz. maximizing the entropy and maximizing the a posteriori (MAP) probability density function (PDF) of the unknown sample, the resulting estimate is the same as that of an Autoregressive (AR) model. This leads to the conclusion that the estimate from an AR model is optimum in the sense that it is the MAP estimate which maximizes entropy.",
author = "Unnikrishna Pillai",
year = "1985",
doi = "10.1109/PROC.1985.13208",
language = "English (US)",
volume = "73",
pages = "843--844",
journal = "Proceedings of the IEEE",
issn = "0018-9219",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "4",

}

TY - JOUR

T1 - A MAP Estimate that Maximizes Entropy—An Alternative Interpretation for an Autoregressive Model

AU - Pillai, Unnikrishna

PY - 1985

Y1 - 1985

N2 - It is shown here that when extrapolation of a sequence of data with unknown statistics is performed under two optimization constraints, viz. maximizing the entropy and maximizing the a posteriori (MAP) probability density function (PDF) of the unknown sample, the resulting estimate is the same as that of an Autoregressive (AR) model. This leads to the conclusion that the estimate from an AR model is optimum in the sense that it is the MAP estimate which maximizes entropy.

AB - It is shown here that when extrapolation of a sequence of data with unknown statistics is performed under two optimization constraints, viz. maximizing the entropy and maximizing the a posteriori (MAP) probability density function (PDF) of the unknown sample, the resulting estimate is the same as that of an Autoregressive (AR) model. This leads to the conclusion that the estimate from an AR model is optimum in the sense that it is the MAP estimate which maximizes entropy.

UR - http://www.scopus.com/inward/record.url?scp=84941463836&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84941463836&partnerID=8YFLogxK

U2 - 10.1109/PROC.1985.13208

DO - 10.1109/PROC.1985.13208

M3 - Article

AN - SCOPUS:84941463836

VL - 73

SP - 843

EP - 844

JO - Proceedings of the IEEE

JF - Proceedings of the IEEE

SN - 0018-9219

IS - 4

ER -