### Abstract

Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve a lack of information in the historical climate record compared with an ensemble prediction, or a lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, a signal/dispersion decomposition, etc. An explicit computationally feasible family of estimators is developed here for estimating the relative entropy over a large dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here for ensemble predictions for two "toy" climate models developed recently: the Galerkin truncation of the Burgers-Hopf equation and the Lorenz '96 model.
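The central quantity the abstract describes, the relative entropy (Kullback-Leibler divergence) between an ensemble's empirical distribution and its Gaussian fit through the first two moments, can be illustrated with a minimal one-dimensional histogram sketch. This is only an assumed illustration of the concept, not the paper's hierarchical estimator or its convex-optimization machinery; the function name `relative_entropy_vs_gaussian` and all parameters here are hypothetical.

```python
import numpy as np

def relative_entropy_vs_gaussian(samples, bins=64):
    """Estimate D(p || g): the extra information in an ensemble's
    empirical density p beyond its Gaussian fit g (same mean and
    variance). Histogram-based sketch, not the paper's estimator."""
    mu, sigma = samples.mean(), samples.std()
    # Empirical density on a uniform grid of bin centers.
    counts, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = edges[1] - edges[0]
    # Gaussian density with the ensemble's first two moments.
    g = np.exp(-0.5 * ((centers - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    mask = (counts > 0) & (g > 0)
    # Riemann-sum approximation of the relative-entropy integral.
    return np.sum(counts[mask] * np.log(counts[mask] / g[mask])) * width

rng = np.random.default_rng(0)
gaussian = rng.normal(size=100_000)       # Gaussian fit captures everything
skewed = rng.exponential(size=100_000)    # strongly non-Gaussian ensemble
print(relative_entropy_vs_gaussian(gaussian))  # near 0
print(relative_entropy_vs_gaussian(skewed))    # clearly positive
```

A value near zero means the two-moment Gaussian strategy loses almost no information about the ensemble; a large value flags exactly the non-Gaussian situations the paper's estimators are designed to quantify in high dimensions.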

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 411-447 |
| Number of pages | 37 |
| Journal | SIAM Journal on Scientific Computing |
| Volume | 26 |
| Issue number | 2 |
| DOIs | https://doi.org/10.1137/S1064827503426310 |
| State | Published - 2005 |

### Keywords

- Ensemble predictions
- Predictability
- Relative entropy

### ASJC Scopus subject areas

- Mathematics(all)
- Applied Mathematics

### Cite this

Abramov, Rafail V., & Majda, Andrew J. (2005). Quantifying uncertainty for non-Gaussian ensembles in complex systems. *SIAM Journal on Scientific Computing*, *26*(2), 411-447. https://doi.org/10.1137/S1064827503426310

Research output: Contribution to journal › Article

TY - JOUR

T1 - Quantifying uncertainty for non-Gaussian ensembles in complex systems

AU - Abramov, Rafail V.

AU - Majda, Andrew J.

PY - 2005

Y1 - 2005

AB - Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve a lack of information in the historical climate record compared with an ensemble prediction, or a lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify this information. Here a recently developed mathematical theory for quantifying this lack of information is converted into a practical algorithmic tool. The theory involves explicit estimators obtained through convex optimization, principal predictability components, a signal/dispersion decomposition, etc. An explicit computationally feasible family of estimators is developed here for estimating the relative entropy over a large dimensional family of variables through a simple hierarchical strategy. Many facets of this computational strategy for estimating uncertainty are applied here for ensemble predictions for two "toy" climate models developed recently: the Galerkin truncation of the Burgers-Hopf equation and the Lorenz '96 model.

KW - Ensemble predictions

KW - Predictability

KW - Relative entropy

UR - http://www.scopus.com/inward/record.url?scp=16244415816&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=16244415816&partnerID=8YFLogxK

U2 - 10.1137/S1064827503426310

DO - 10.1137/S1064827503426310

M3 - Article

AN - SCOPUS:16244415816

VL - 26

SP - 411

EP - 447

JO - SIAM Journal on Scientific Computing

JF - SIAM Journal on Scientific Computing

SN - 1064-8275

IS - 2

ER -