Quantifying uncertainty for non-Gaussian ensembles in complex systems

Rafail V. Abramov, Andrew J. Majda

Research output: Contribution to journal › Article

Abstract

Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve a lack of information in the historical climate record compared with an ensemble prediction, or a lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, a signal/dispersion decomposition, etc. An explicit computationally feasible family of estimators is developed here for estimating the relative entropy over a large dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here for ensemble predictions for two "toy" climate models developed recently: the Galerkin truncation of the Burgers-Hopf equation and the Lorenz '96 model.
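
To make the Gaussian baseline mentioned in the abstract concrete, the short Python sketch below (an illustration assuming NumPy; it is not the convex-optimization estimators developed in the paper) computes the relative entropy of a Gaussian fit to a prediction ensemble with respect to a Gaussian climatological prior, split into the signal and dispersion contributions. All variable names and the synthetic data are hypothetical.

# Illustrative sketch only: Gaussian relative entropy with a signal/dispersion split
import numpy as np

def gaussian_relative_entropy(ensemble, climate_mean, climate_cov):
    # Gaussian fit to the prediction ensemble (first and second moments only)
    mu = ensemble.mean(axis=0)
    sigma = np.cov(ensemble, rowvar=False)
    d = climate_mean.size

    clim_inv = np.linalg.inv(climate_cov)
    shift = mu - climate_mean
    ratio = clim_inv @ sigma

    # Signal: information carried by the shift of the ensemble mean
    signal = 0.5 * shift @ clim_inv @ shift
    # Dispersion: information carried by the change in covariance;
    # slogdet avoids determinant overflow/underflow in higher dimensions
    _, logdet = np.linalg.slogdet(ratio)
    dispersion = 0.5 * (np.trace(ratio) - d - logdet)
    return signal + dispersion, signal, dispersion

# Hypothetical usage: 3 variables, a 200-member ensemble shifted off the climatology
rng = np.random.default_rng(0)
clim_mean, clim_cov = np.zeros(3), np.eye(3)
members = rng.multivariate_normal(clim_mean + 0.5, 0.5 * clim_cov, size=200)
total, sig, disp = gaussian_relative_entropy(members, clim_mean, clim_cov)
print(f"relative entropy {total:.3f} = signal {sig:.3f} + dispersion {disp:.3f}")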

Original language: English (US)
Pages (from-to): 411-447
Number of pages: 37
Journal: SIAM Journal on Scientific Computing
Volume: 26
Issue number: 2
DOIs: 10.1137/S1064827503426310
State: Published - 2005

Keywords

  • Ensemble predictions
  • Predictability
  • Relative entropy

ASJC Scopus subject areas

  • Mathematics (all)
  • Applied Mathematics

Cite this

Quantifying uncertainty for non-Gaussian ensembles in complex systems. / Abramov, Rafail V.; Majda, Andrew J.

In: SIAM Journal on Scientific Computing, Vol. 26, No. 2, 2005, p. 411-447.

@article{cb4c8fc97c2149b2b9bf063fe9387e21,
title = "Quantifying uncertainty for non-gaussian ensembles in complex systems",
abstract = "Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve a lack of information in the historical climate record compared with an ensemble prediction, or a lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, a signal/dispersion decomposition, etc. An explicit computationally feasible family of estimators is developed here for estimating the relative entropy over a large dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here for ensemble predictions for two {"}toy{"} climate models developed recently: the Galerkin truncation of the Burgers-Hopf equation and the Lorenz '96 model.",
keywords = "Ensemble predictions, Predictability, Relative entropy",
author = "Abramov, {Rafail V.} and Majda, {Andrew J.}",
year = "2005",
doi = "10.1137/S1064827503426310",
language = "English (US)",
volume = "26",
pages = "411--447",
journal = "SIAM Journal of Scientific Computing",
issn = "1064-8275",
publisher = "Society for Industrial and Applied Mathematics Publications",
number = "2",

}

TY - JOUR

T1 - Quantifying uncertainty for non-Gaussian ensembles in complex systems

AU - Abramov, Rafail V.

AU - Majda, Andrew J.

PY - 2005

Y1 - 2005

N2 - Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve a lack of information in the historical climate record compared with an ensemble prediction, or a lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, a signal/dispersion decomposition, etc. An explicit computationally feasible family of estimators is developed here for estimating the relative entropy over a large dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here for ensemble predictions for two "toy" climate models developed recently: the Galerkin truncation of the Burgers-Hopf equation and the Lorenz '96 model.

AB - Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve a lack of information in the historical climate record compared with an ensemble prediction, or a lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, a signal/dispersion decomposition, etc. An explicit computationally feasible family of estimators is developed here for estimating the relative entropy over a large dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here for ensemble predictions for two "toy" climate models developed recently: the Galerkin truncation of the Burgers-Hopf equation and the Lorenz '96 model.

KW - Ensemble predictions

KW - Predictability

KW - Relative entropy

UR - http://www.scopus.com/inward/record.url?scp=16244415816&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=16244415816&partnerID=8YFLogxK

U2 - 10.1137/S1064827503426310

DO - 10.1137/S1064827503426310

M3 - Article

AN - SCOPUS:16244415816

VL - 26

SP - 411

EP - 447

JO - SIAM Journal on Scientific Computing

JF - SIAM Journal on Scientific Computing

SN - 1064-8275

IS - 2

ER -