Deep exponential families

Rajesh Ranganath, Linpeng Tang, Laurent Charlin, David M. Blei

Research output: Contribution to journal › Conference article

Abstract

We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.
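To make the model class concrete, the following is a minimal, illustrative sketch of sampling from a two-layer DEF with gamma latent layers and Poisson observations (a "sparse gamma DEF" in the paper's terminology). The layer sizes, hyperparameters, and weight priors below are arbitrary choices for the sketch, not the paper's experimental settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's settings):
K2, K1, V = 5, 10, 20   # top-layer units, bottom-layer units, vocabulary size
alpha = 0.1             # gamma shape; values < 1 encourage sparse activations

# Weights linking adjacent layers, drawn from a sparse gamma prior.
W1 = rng.gamma(0.1, 1.0, size=(K2, K1))  # layer 2 -> layer 1
W0 = rng.gamma(0.1, 1.0, size=(K1, V))   # layer 1 -> observations

def sample_document(rng):
    """Draw one observation (a vector of word counts) from the generative process."""
    z2 = rng.gamma(alpha, 1.0, size=K2)       # top-layer latents
    # Each lower layer is drawn from an exponential family whose mean is an
    # inner product of the layer above with the weights: E[z1] = z2 @ W1.
    z1 = rng.gamma(alpha, (z2 @ W1) / alpha)  # scale chosen so the mean is z2 @ W1
    x = rng.poisson(z1 @ W0)                  # Poisson-distributed word counts
    return x

x = sample_document(rng)
print(x.shape)
```

The hierarchy generalizes directly: swapping the gamma layers for other exponential families (e.g., Gaussian or Bernoulli) changes the latent representation while keeping the same layered mean-parameterization, which is what makes black-box variational inference applicable across the whole class.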

Original language: English (US)
Pages (from-to): 762-771
Number of pages: 10
Journal: Journal of Machine Learning Research
Volume: 38
State: Published - Jan 1 2015


ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Ranganath, R., Tang, L., Charlin, L., & Blei, D. M. (2015). Deep exponential families. Journal of Machine Learning Research, 38, 762-771.

@article{95a1b3f5a30949f8a609482a6764f0e0,
title = "Deep exponential families",
abstract = "We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent {"}black box{"} variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.",
author = "Rajesh Ranganath and Linpeng Tang and Laurent Charlin and Blei, {David M.}",
year = "2015",
month = "1",
day = "1",
language = "English (US)",
volume = "38",
pages = "762--771",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
}
