Max-margin learning with the Bayes factor

Rahul G. Krishnan, Arjun Khandelwal, Rajesh Ranganath, David Sontag

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a new way to answer probabilistic queries that span multiple datapoints. We formalize reasoning about the similarity of different datapoints as the evaluation of the Bayes Factor within a hierarchical deep generative model that enforces a separation between the latent variables used for representation learning and those used for reasoning. Under this model we derive an intuitive estimator for the Bayes Factor that represents similarity as the amount of overlap in representation space shared by different points. The estimator we derive relies on a query-conditional latent reasoning network that parameterizes a distribution over the latent space of the deep generative model. The latent reasoning network is trained to amortize the posterior-predictive distribution under a hierarchical model using supervised data and a max-margin learning algorithm. We explore how the model may be used to focus the data variations captured in the latent space of the deep generative model and how this may be used to build new algorithms for few-shot learning.
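The quantity at the center of the abstract, the Bayes Factor for a cross-datapoint query, compares the evidence that two observations were generated from a shared latent variable against the evidence that they were generated independently: BF(x1, x2) = p(x1, x2 | shared latent) / (p(x1) p(x2)). As a rough illustration only, the sketch below evaluates this ratio by plain Monte Carlo in a toy Gaussian latent-variable model; the toy densities, the prior-sampling proposal, and all function names are assumptions made for exposition and stand in for the paper's learned deep generative model and query-conditional latent reasoning network.

import numpy as np

# Toy sketch (assumed model, not the paper's architecture): a shared scalar
# latent beta ~ N(0, 1) and observations x | beta ~ N(beta, sigma^2).
# The Bayes Factor scores whether x1 and x2 overlap in latent space.
rng = np.random.default_rng(0)
sigma = 0.5

def logsumexp(a):
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def log_lik(x, beta):
    # log N(x; beta, sigma^2)
    return -0.5 * ((x - beta) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

def log_marginal(x, n=20000):
    # Monte Carlo estimate of log p(x) = log E_{beta ~ N(0,1)}[p(x | beta)]
    betas = rng.standard_normal(n)
    return logsumexp(log_lik(x, betas)) - np.log(n)

def log_bayes_factor(x1, x2, n=20000):
    # log BF = log p(x1, x2 | shared beta) - log p(x1) - log p(x2).
    # Samples are drawn from the prior here; the paper instead amortizes this
    # with a query-conditional latent reasoning network (a learned proposal).
    betas = rng.standard_normal(n)
    log_joint = logsumexp(log_lik(x1, betas) + log_lik(x2, betas)) - np.log(n)
    return log_joint - log_marginal(x1) - log_marginal(x2)

print(log_bayes_factor(0.1, 0.2))   # nearby points  -> log BF > 0 (similar)
print(log_bayes_factor(-2.0, 2.0))  # distant points -> log BF < 0 (dissimilar)

In this toy setting, points whose posteriors over beta overlap get a Bayes Factor above one, which matches the abstract's reading of similarity as the amount of overlap in representation space.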

Original language: English (US)
Title of host publication: 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018
Editors: Ricardo Silva, Amir Globerson
Publisher: Association For Uncertainty in Artificial Intelligence (AUAI)
Pages: 896-905
Number of pages: 10
Volume: 2
ISBN (Electronic): 9781510871601
State: Published - Jan 1 2018
Event: 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018 - Monterey, United States
Duration: Aug 6 2018 - Aug 10 2018

Other

Other: 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018
Country: United States
City: Monterey
Period: 8/6/18 - 8/10/18

ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Krishnan, R. G., Khandelwal, A., Ranganath, R., & Sontag, D. (2018). Max-margin learning with the Bayes factor. In R. Silva & A. Globerson (Eds.), 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018 (Vol. 2, pp. 896-905). Association For Uncertainty in Artificial Intelligence (AUAI).

Max-margin learning with the Bayes factor. / Krishnan, Rahul G.; Khandelwal, Arjun; Ranganath, Rajesh; Sontag, David.

34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018. ed. / Ricardo Silva; Amir Globerson. Vol. 2 Association For Uncertainty in Artificial Intelligence (AUAI), 2018. p. 896-905.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Krishnan, RG, Khandelwal, A, Ranganath, R & Sontag, D 2018, Max-margin learning with the Bayes factor. in R Silva & A Globerson (eds), 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018. vol. 2, Association For Uncertainty in Artificial Intelligence (AUAI), pp. 896-905, 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018, Monterey, United States, 8/6/18.
Krishnan RG, Khandelwal A, Ranganath R, Sontag D. Max-margin learning with the Bayes factor. In Silva R, Globerson A, editors, 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018. Vol. 2. Association For Uncertainty in Artificial Intelligence (AUAI). 2018. p. 896-905
Krishnan, Rahul G. ; Khandelwal, Arjun ; Ranganath, Rajesh ; Sontag, David. / Max-margin learning with the Bayes factor. 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018. editor / Ricardo Silva ; Amir Globerson. Vol. 2 Association For Uncertainty in Artificial Intelligence (AUAI), 2018. pp. 896-905
@inproceedings{c64f485ec6414d0eaee4f854753e815b,
title = "Max-margin learning with the Bayes factor",
abstract = "We propose a new way to answer probabil tic queries that span multiple datapoints. W formalize reasoning about the similarity of d ferent datapoints as the evaluation of the Bay Factor within a hierarchical deep generati model that enforces a separation between th latent variables used for representation learnin and those used for reasoning. Under this mod we derive an intuitive estimator for the Bay Factor that represents similarity as the amou of overlap in representation space shared by d ferent points. The estimator we derive relies o a query-conditional latent reasoning networ that parameterizes a distribution over the latent space of the deep generative model. The latent reasoning network is trained to amortize the posterior-predictive distribution under a hierarchical model using supervised data and a max-margin learning algorithm. We explore how the model may be used to focus the data variations captured in the latent space of the deep generative model and how this may be used to build new algorithms for few-shot learning.",
author = "Krishnan, {Rahul G.} and Arjun Khandelwal and Rajesh Ranganath and David Sontag",
year = "2018",
month = "1",
day = "1",
language = "English (US)",
volume = "2",
pages = "896--905",
editor = "Ricardo Silva and Amir Globerson and Amir Globerson",
booktitle = "34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018",
publisher = "Association For Uncertainty in Artificial Intelligence (AUAI)",

}

TY - GEN

T1 - Max-margin learning with the Bayes factor

AU - Krishnan, Rahul G.

AU - Khandelwal, Arjun

AU - Ranganath, Rajesh

AU - Sontag, David

PY - 2018/1/1

Y1 - 2018/1/1

N2 - We propose a new way to answer probabilistic queries that span multiple datapoints. We formalize reasoning about the similarity of different datapoints as the evaluation of the Bayes Factor within a hierarchical deep generative model that enforces a separation between the latent variables used for representation learning and those used for reasoning. Under this model we derive an intuitive estimator for the Bayes Factor that represents similarity as the amount of overlap in representation space shared by different points. The estimator we derive relies on a query-conditional latent reasoning network that parameterizes a distribution over the latent space of the deep generative model. The latent reasoning network is trained to amortize the posterior-predictive distribution under a hierarchical model using supervised data and a max-margin learning algorithm. We explore how the model may be used to focus the data variations captured in the latent space of the deep generative model and how this may be used to build new algorithms for few-shot learning.

AB - We propose a new way to answer probabilistic queries that span multiple datapoints. We formalize reasoning about the similarity of different datapoints as the evaluation of the Bayes Factor within a hierarchical deep generative model that enforces a separation between the latent variables used for representation learning and those used for reasoning. Under this model we derive an intuitive estimator for the Bayes Factor that represents similarity as the amount of overlap in representation space shared by different points. The estimator we derive relies on a query-conditional latent reasoning network that parameterizes a distribution over the latent space of the deep generative model. The latent reasoning network is trained to amortize the posterior-predictive distribution under a hierarchical model using supervised data and a max-margin learning algorithm. We explore how the model may be used to focus the data variations captured in the latent space of the deep generative model and how this may be used to build new algorithms for few-shot learning.

UR - http://www.scopus.com/inward/record.url?scp=85059398200&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85059398200&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85059398200

VL - 2

SP - 896

EP - 905

BT - 34th Conference on Uncertainty in Artificial Intelligence 2018, UAI 2018

A2 - Silva, Ricardo

A2 - Globerson, Amir

PB - Association For Uncertainty in Artificial Intelligence (AUAI)

ER -