### Abstract

We address the numerical solution of infinite-dimensional inverse problems in the framework of Bayesian inference. In Part I of this paper [T. Bui-Thanh, O. Ghattas, J. Martin, and G. Stadler, SIAM J. Sci. Comput., 35 (2013), pp. A2494-A2523] we considered the linearized infinite-dimensional inverse problem. In Part II, we relax the linearization assumption and consider the fully nonlinear infinite-dimensional inverse problem using a Markov chain Monte Carlo (MCMC) sampling method. To address the challenges of sampling high-dimensional probability density functions (pdfs) arising upon discretization of Bayesian inverse problems governed by PDEs, we build upon the stochastic Newton MCMC method. This method exploits problem structure by taking as a proposal density a local Gaussian approximation of the posterior pdf, whose covariance operator is given by the inverse of the local Hessian of the negative log posterior pdf. The construction of the covariance is made tractable by invoking a low-rank approximation of the data misfit component of the Hessian. Here we introduce an approximation of the stochastic Newton proposal in which we compute the low-rank-based Hessian at just the maximum a posteriori (MAP) point, and then reuse this Hessian at each MCMC step. We compare the performance of the proposed method to the original stochastic Newton MCMC method and to an independence sampler. The comparison of the three methods is conducted on a synthetic ice sheet inverse problem. For this problem, the stochastic Newton MCMC method with a MAP-based Hessian converges at least as rapidly as the original stochastic Newton MCMC method, but is far cheaper since it avoids recomputing the Hessian at each step. On the other hand, it is more expensive per sample than the independence sampler; however, its convergence is significantly more rapid, and thus overall it is much cheaper. Finally, we present extensive analysis and interpretation of the posterior distribution and classify directions in parameter space based on the extent to which they are informed by the prior or the observations.
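The MAP-based proposal described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it runs Metropolis-Hastings with the Gaussian proposal N(m - H⁻¹∇J(m), H⁻¹), where J is the negative log posterior and H is its Hessian, computed once at the MAP point and reused at every step. The paper applies H⁻¹ through a low-rank approximation of the data misfit Hessian; here H is formed and inverted densely for clarity, and all function and variable names are illustrative.

```python
import numpy as np

def stochastic_newton_map_mcmc(neg_log_post, grad, H_map, m0, n_steps, rng):
    """Metropolis-Hastings with the Gaussian proposal
    N(m - H^{-1} grad(m), H^{-1}), where H is the Hessian of the
    negative log posterior J, computed once at the MAP point and reused.

    neg_log_post, grad : callables returning J(m) and its gradient
    H_map : (d, d) Hessian of J at the MAP point (dense here for clarity)
    """
    H_inv = np.linalg.inv(H_map)
    L = np.linalg.cholesky(H_inv)  # factor for drawing N(0, H^{-1}) samples

    def log_q(y, m):
        # Log proposal density of y given m, up to a constant; since H is
        # fixed across steps, normalization terms cancel in the ratio.
        r = y - (m - H_inv @ grad(m))  # deviation from the Newton step
        return -0.5 * r @ H_map @ r

    m = np.asarray(m0, dtype=float).copy()
    chain = [m.copy()]
    for _ in range(n_steps):
        # Propose from the local Gaussian centered at the Newton step.
        y = m - H_inv @ grad(m) + L @ rng.standard_normal(m.size)
        log_alpha = (neg_log_post(m) - neg_log_post(y)
                     + log_q(m, y) - log_q(y, m))
        if np.log(rng.uniform()) < log_alpha:
            m = y
        chain.append(m.copy())
    return np.array(chain)
```

For a Gaussian posterior whose precision equals H, the proposal coincides with the posterior and every step is accepted; for the nonlinear problems targeted by the paper, the MAP-based Hessian is only locally accurate, and trading that accuracy for the cost of Hessian recomputation is exactly the comparison the authors study.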

Original language | English (US) |
---|---|
Pages (from-to) | A1525-A1555 |
Journal | SIAM Journal on Scientific Computing |
Volume | 36 |
Issue number | 4 |
DOIs | https://doi.org/10.1137/130934805 |
State | Published - 2014 |

### Keywords

- Bayesian inference
- Ice sheet dynamics
- Infinite-dimensional inverse problems
- Low-rank approximation
- MCMC
- Stochastic Newton
- Uncertainty quantification

### ASJC Scopus subject areas

- Computational Mathematics
- Applied Mathematics

### Cite this

Petra, N., Martin, J., Stadler, G., & Ghattas, O. (2014). A computational framework for infinite-dimensional Bayesian inverse problems, Part II: Stochastic Newton MCMC with application to ice sheet flow inverse problems. *SIAM Journal on Scientific Computing*, *36*(4), A1525-A1555. https://doi.org/10.1137/130934805

Research output: Contribution to journal › Article
