Variational sequential Monte Carlo

Christian A. Naesseth, Scott W. Linderman, Rajesh Ranganath, David M. Blei

Research output: Contribution to conference › Paper

Abstract

Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, while still allowing for efficient optimization of its parameters. We demonstrate its utility on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
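The surrogate objective behind VSMC is the expected logarithm of the SMC marginal-likelihood estimator, which by Jensen's inequality lower-bounds the exact log marginal likelihood. The sketch below illustrates this property on a scalar linear-Gaussian state space model; the model, its parameters, and the bootstrap proposal are illustrative assumptions, not the paper's setup (the exact log marginal likelihood is available here via a Kalman filter, which makes the bound checkable):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D linear-Gaussian state space model (an illustrative assumption,
# not one of the models studied in the paper):
#   x_t = a * x_{t-1} + N(0, q),   y_t = x_t + N(0, r),   x_0 = 0
a, q, r = 0.9, 0.5, 1.0
T = 25

# Simulate a data set y_{1:T}
x = 0.0
ys = []
for _ in range(T):
    x = a * x + rng.normal(0.0, np.sqrt(q))
    ys.append(x + rng.normal(0.0, np.sqrt(r)))
ys = np.array(ys)

def smc_log_Z(ys, rng, N):
    """One SMC sweep with the bootstrap proposal; returns log Z-hat, where
    Z-hat is an unbiased estimate of the marginal likelihood p(y_{1:T})."""
    particles = np.zeros(N)
    log_Z = 0.0
    for y in ys:
        particles = a * particles + rng.normal(0.0, np.sqrt(q), N)  # propose
        logw = -0.5 * (y - particles) ** 2 / r - 0.5 * np.log(2 * np.pi * r)
        log_Z += np.logaddexp.reduce(logw) - np.log(N)  # running log Z-hat
        w = np.exp(logw - logw.max())
        w /= w.sum()
        particles = particles[rng.choice(N, size=N, p=w)]  # multinomial resampling
    return log_Z

def kalman_log_Z(ys):
    """Exact log p(y_{1:T}) for the same scalar model, via the Kalman filter."""
    m, P = 0.0, q  # predictive mean/variance of x_1
    ll = 0.0
    for y in ys:
        S = P + r  # innovation variance
        ll += -0.5 * ((y - m) ** 2 / S + np.log(2 * np.pi * S))
        K = P / S
        m, P = m + K * (y - m), (1.0 - K) * P  # measurement update
        m, P = a * m, a ** 2 * P + q  # predict next state
    return ll

# The VSMC surrogate is E[log Z-hat]: a lower bound on the exact log marginal
# likelihood whose gap shrinks as the particle count or proposal quality grows.
bound = np.mean([smc_log_Z(ys, rng, N=5) for _ in range(200)])
exact = kalman_log_Z(ys)
```

In VSMC proper the proposal would carry variational parameters and this bound would be maximized with stochastic gradients; the bootstrap proposal above simply fixes the proposal to the model dynamics.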

Original language: English (US)
Pages: 968-977
Number of pages: 10
State: Published - Jan 1 2018
Event: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 - Playa Blanca, Lanzarote, Canary Islands, Spain
Duration: Apr 9 2018 - Apr 11 2018

Conference

Conference: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018
Country: Spain
City: Playa Blanca, Lanzarote, Canary Islands
Period: 4/9/18 - 4/11/18

ASJC Scopus subject areas

  • Statistics and Probability
  • Artificial Intelligence

Cite this

Naesseth, C. A., Linderman, S. W., Ranganath, R., & Blei, D. M. (2018). Variational sequential Monte Carlo. 968-977. Paper presented at 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018, Playa Blanca, Lanzarote, Canary Islands, Spain.

@conference{df108e831233493984d29bd97e963901,
title = "Variational sequential Monte Carlo",
abstract = "Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, while still allowing for efficient optimization of its parameters. We demonstrate its utility on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.",
author = "Naesseth, {Christian A.} and Linderman, {Scott W.} and Rajesh Ranganath and Blei, {David M.}",
year = "2018",
month = "1",
day = "1",
language = "English (US)",
pages = "968--977",
note = "21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 ; Conference date: 09-04-2018 Through 11-04-2018",

}

TY  - CONF
T1  - Variational sequential Monte Carlo
AU  - Naesseth, Christian A.
AU  - Linderman, Scott W.
AU  - Ranganath, Rajesh
AU  - Blei, David M.
PY  - 2018/1/1
Y1  - 2018/1/1
AB  - Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, while still allowing for efficient optimization of its parameters. We demonstrate its utility on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
UR  - http://www.scopus.com/inward/record.url?scp=85057232074&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=85057232074&partnerID=8YFLogxK
M3  - Paper
SP  - 968
EP  - 977
ER  -