Black box variational inference

Rajesh Ranganath, Sean Gerrish, David M. Blei

Research output: Contribution to journal › Conference article

Abstract

Variational inference has become a widely used method to approximate posteriors in complex latent variable models. However, deriving a variational inference algorithm generally requires significant model-specific analysis. These efforts can hinder and deter us from quickly developing and exploring a variety of models for a problem at hand. In this paper, we present a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation. Our method is based on a stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples drawn from the variational distribution. We develop a number of methods to reduce the variance of the gradient, always maintaining the criterion that we want to avoid difficult model-based derivations. We evaluate our method against the corresponding black box sampling-based methods. We find that our method reaches better predictive likelihoods much faster than sampling methods. Finally, we demonstrate that Black Box Variational Inference lets us easily explore a wide space of models by quickly constructing and evaluating several models of longitudinal healthcare data.
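The estimator the abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' code: it applies the score-function gradient estimator, with a simple mean baseline standing in for the paper's more involved variance-reduction methods, to a toy conjugate-Gaussian model where the exact ELBO gradient is available in closed form. All names (`log_joint`, `elbo_grad_mu`) and the model choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: prior z ~ N(0, 1), likelihood x_i ~ N(z, 1).
x = rng.normal(2.0, 1.0, size=50)
n = len(x)

def log_joint(z):
    """log p(x, z) up to an additive constant, vectorized over samples z."""
    prior = -0.5 * z ** 2
    lik = -0.5 * ((x[:, None] - z[None, :]) ** 2).sum(axis=0)
    return prior + lik

def elbo_grad_mu(mu, sigma, num_samples, baseline=True):
    """Noisy ELBO gradient w.r.t. mu via the score-function estimator:
    E_q[ grad_mu log q(z) * (log p(x, z) - log q(z)) ],
    with the expectation replaced by a Monte Carlo average over q."""
    z = rng.normal(mu, sigma, size=num_samples)
    log_q = -np.log(sigma) - 0.5 * ((z - mu) / sigma) ** 2  # up to a constant
    score = (z - mu) / sigma ** 2       # grad_mu log q(z | mu, sigma)
    f = log_joint(z) - log_q
    if baseline:
        f = f - f.mean()                # crude control variate; E[score] = 0
    return np.mean(score * f)

# For this conjugate model the exact gradient is sum(x) - (n + 1) * mu,
# so the black-box estimate can be checked against it directly.
mu, sigma = 0.0, 1.0
est = elbo_grad_mu(mu, sigma, num_samples=50_000)
exact = x.sum() - (n + 1) * mu
```

Note that only log-density evaluations of the joint and the variational distribution are needed, which is what makes the estimator "black box"; subtracting the sample mean of `f` leaves the estimate unbiased (the score has zero mean) while visibly shrinking its variance.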

Original language: English (US)
Pages (from-to): 814-822
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 33
State: Published - Jan 1 2014


ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Ranganath, R., Gerrish, S., & Blei, D. M. (2014). Black box variational inference. Journal of Machine Learning Research, 33, 814-822.

Black box variational inference. / Ranganath, Rajesh; Gerrish, Sean; Blei, David M.

In: Journal of Machine Learning Research, Vol. 33, 01.01.2014, p. 814-822.

Ranganath, R, Gerrish, S & Blei, DM 2014, 'Black box variational inference', Journal of Machine Learning Research, vol. 33, pp. 814-822.
Ranganath R, Gerrish S, Blei DM. Black box variational inference. Journal of Machine Learning Research. 2014 Jan 1;33:814-822.
Ranganath, Rajesh ; Gerrish, Sean ; Blei, David M. / Black box variational inference. In: Journal of Machine Learning Research. 2014 ; Vol. 33. pp. 814-822.
@article{3b928130141f445595b379bc504ca7a8,
title = "Black box variational inference",
abstract = "Variational inference has become a widely used method to approximate posteriors in complex latent variables models. However, deriving a variational inference algorithm generally requires significant model-specific analysis. These efforts can hinder and deter us from quickly developing and exploring a variety of models for a problem at hand. In this paper, we present a {"}black box{"} variational inference algorithm, one that can be quickly applied to many models with little additional derivation. Our method is based on a stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples from the variational distribution. We develop a number of methods to reduce the variance of the gradient, always maintaining the criterion that we want to avoid difficult model-based derivations. We evaluate our method against the corresponding black box sampling based methods. We find that our method reaches better predictive likelihoods much faster than sampling methods. Finally, we demonstrate that Black Box Variational Inference lets us easily explore a wide space of models by quickly constructing and evaluating several models of longitudinal healthcare data.",
author = "Ranganath, Rajesh and Gerrish, Sean and Blei, {David M.}",
year = "2014",
month = "1",
day = "1",
language = "English (US)",
volume = "33",
pages = "814--822",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
}

TY - JOUR

T1 - Black box variational inference

AU - Ranganath, Rajesh

AU - Gerrish, Sean

AU - Blei, David M.

PY - 2014/1/1

Y1 - 2014/1/1

N2 - Variational inference has become a widely used method to approximate posteriors in complex latent variables models. However, deriving a variational inference algorithm generally requires significant model-specific analysis. These efforts can hinder and deter us from quickly developing and exploring a variety of models for a problem at hand. In this paper, we present a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation. Our method is based on a stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples from the variational distribution. We develop a number of methods to reduce the variance of the gradient, always maintaining the criterion that we want to avoid difficult model-based derivations. We evaluate our method against the corresponding black box sampling based methods. We find that our method reaches better predictive likelihoods much faster than sampling methods. Finally, we demonstrate that Black Box Variational Inference lets us easily explore a wide space of models by quickly constructing and evaluating several models of longitudinal healthcare data.

AB - Variational inference has become a widely used method to approximate posteriors in complex latent variables models. However, deriving a variational inference algorithm generally requires significant model-specific analysis. These efforts can hinder and deter us from quickly developing and exploring a variety of models for a problem at hand. In this paper, we present a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation. Our method is based on a stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples from the variational distribution. We develop a number of methods to reduce the variance of the gradient, always maintaining the criterion that we want to avoid difficult model-based derivations. We evaluate our method against the corresponding black box sampling based methods. We find that our method reaches better predictive likelihoods much faster than sampling methods. Finally, we demonstrate that Black Box Variational Inference lets us easily explore a wide space of models by quickly constructing and evaluating several models of longitudinal healthcare data.

UR - http://www.scopus.com/inward/record.url?scp=84955506831&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84955506831&partnerID=8YFLogxK

M3 - Conference article

VL - 33

SP - 814

EP - 822

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -