Uniqueness of a two-step predictor based spectral estimator that generalizes the maximum entropy concept

Theodore I. Shim, Unnikrishna Pillai, Won Cheol Lee

Research output: Contribution to journal › Article

Abstract

Given a finite set of autocorrelations, it is well known that maximization of the entropy functional subject to this data leads to a stable autoregressive (AR) model. Since maximization of the entropy functional is equivalent to maximization of the minimum mean square error associated with one-step predictors, the problem of obtaining admissible extensions that maximize the k-step minimum mean square prediction error subject to the given autocorrelations is meaningful, and it has been shown to result in stable ARMA extensions (see the work by Pillai et al.). The uniqueness of this true generalization of the maximum entropy extension is proved here through a constructive procedure in the case of two-step predictors.
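For context on the baseline the paper generalizes, the classical maximum entropy (one-step predictor) extension of a finite autocorrelation set is the AR model obtained from the Levinson-Durbin recursion. The Python sketch below illustrates that standard construction only; it is not the paper's two-step ARMA procedure, and the function names are illustrative.

import numpy as np

def levinson_durbin(r):
    """Solve the Yule-Walker equations by the Levinson-Durbin recursion.

    r : autocorrelations r[0], ..., r[p] (real-valued)
    Returns (a, sigma2), where a = [1, a1, ..., ap] are the AR coefficients
    and sigma2 is the one-step minimum mean-square prediction error.
    """
    r = np.asarray(r, dtype=float)
    p = len(r) - 1
    a = np.zeros(p + 1)
    a[0] = 1.0
    sigma2 = r[0]
    for k in range(1, p + 1):
        # reflection (PARCOR) coefficient for order k
        acc = r[k] + sum(a[j] * r[k - j] for j in range(1, k))
        kappa = -acc / sigma2
        a_prev = a.copy()
        for j in range(1, k):
            a[j] = a_prev[j] + kappa * a_prev[k - j]
        a[k] = kappa
        sigma2 *= (1.0 - kappa ** 2)
    return a, sigma2

def max_entropy_spectrum(r, n_freq=512):
    """Maximum entropy (AR) spectral estimate matching the given autocorrelations."""
    a, sigma2 = levinson_durbin(r)
    w = np.linspace(0.0, np.pi, n_freq)
    # A(e^{jw}) = 1 + a1*e^{-jw} + ... + ap*e^{-jpw}
    A = np.exp(-1j * np.outer(w, np.arange(len(a)))) @ a
    return w, sigma2 / np.abs(A) ** 2

# Example (assumed test data): autocorrelations of the AR(1) process
# x[n] = 0.5 x[n-1] + w[n] with unit-variance noise, r[k] = 0.5**k / (1 - 0.25).
# The recursion recovers a = [1, -0.5, 0] and sigma2 = 1 exactly.
r = [4.0 / 3.0, 2.0 / 3.0, 1.0 / 3.0]
a, sigma2 = levinson_durbin(r)
w, S = max_entropy_spectrum(r)
print(a, sigma2)

The two-step predictor extension studied in the paper replaces the one-step error sigma2 above with the two-step minimum mean-square prediction error as the quantity being maximized, which leads to ARMA rather than AR extensions.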

Original language: English (US)
Pages (from-to): 2942-2946
Number of pages: 5
Journal: IEEE Transactions on Signal Processing
Volume: 41
Issue number: 9
DOI: 10.1109/78.236517
State: Published - Sep 1993

Fingerprint

  • Entropy
  • Autocorrelation
  • Mean square error

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Uniqueness of a two-step predictor based spectral estimator that generalizes the maximum entropy concept. / Shim, Theodore I.; Pillai, Unnikrishna; Lee, Won Cheol.

In: IEEE Transactions on Signal Processing, Vol. 41, No. 9, 09.1993, p. 2942-2946.

Research output: Contribution to journal › Article

@article{4c6783e537de4b22bd1a604d61585273,
title = "Uniqueness of a two-step predictor based spectral estimator that generalizes the maximum entropy concept",
abstract = "Given a finite set of autocorrelations, it is well known that maximization of the entropy functional subject to this data leads to a stable autoregressive (AR) model. Since maximization of the entropy functional is equivalent to maximization of the minimum mean square error associated with one-step predictors, the problem of obtaining admissible extensions that maximize the k-step minimum mean square prediction error subject to the given autocorrelations is meaningful, and it has been shown to result in stable ARMA extensions (see the work by Pillai et al.). The uniqueness of this true generalization of the maximum entropy extension is proved here through a constructive procedure in the case of two-step predictors.",
author = "Shim, {Theodore I.} and Unnikrishna Pillai and Lee, {Won Cheol}",
year = "1993",
month = "9",
doi = "10.1109/78.236517",
language = "English (US)",
volume = "41",
pages = "2942--2946",
journal = "IEEE Transactions on Signal Processing",
issn = "1053-587X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "9",

}

TY - JOUR

T1 - Uniqueness of a two-step predictor based spectral estimator that generalizes the maximum entropy concept

AU - Shim, Theodore I.

AU - Pillai, Unnikrishna

AU - Lee, Won Cheol

PY - 1993/9

Y1 - 1993/9

N2 - Given a finite set of autocorrelations, it is well known that maximization of the entropy functional subject to this data leads to a stable autoregressive (AR) model. Since maximization of the entropy functional is equivalent to maximization of the minimum mean square error associated with one-step predictors, the problem of obtaining admissible extensions that maximize the k-step minimum mean square prediction error subject to the given autocorrelations is meaningful, and it has been shown to result in stable ARMA extensions (see the work by Pillai et al.). The uniqueness of this true generalization of the maximum entropy extension is proved here through a constructive procedure in the case of two-step predictors.

AB - Given a finite set of autocorrelations, it is well known that maximization of the entropy functional subject to this data leads to a stable autoregressive (AR) model. Since maximization of the entropy functional is equivalent to maximization of the minimum mean square error associated with one-step predictors, the problem of obtaining admissible extensions that maximize the k-step minimum mean square prediction error subject to the given autocorrelations is meaningful, and it has been shown to result in stable ARMA extensions (see the work by Pillai et al.). The uniqueness of this true generalization of the maximum entropy extension is proved here through a constructive procedure in the case of two-step predictors.

UR - http://www.scopus.com/inward/record.url?scp=0027666933&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0027666933&partnerID=8YFLogxK

U2 - 10.1109/78.236517

DO - 10.1109/78.236517

M3 - Article

AN - SCOPUS:0027666933

VL - 41

SP - 2942

EP - 2946

JO - IEEE Transactions on Signal Processing

JF - IEEE Transactions on Signal Processing

SN - 1053-587X

IS - 9

ER -