Tree block coordinate descent for MAP in graphical models

David Sontag, Tommi Jaakkola

Research output: Contribution to journal › Article

Abstract

A number of linear programming relaxations have been proposed for finding most likely settings of the variables (MAP) in large probabilistic models. The relaxations are often succinctly expressed in the dual and reduce to different types of reparameterizations of the original model. The dual objectives are typically solved by performing local block coordinate descent steps. In this work, we show how to perform block coordinate descent on spanning trees of the graphical model. We also show how all of the earlier dual algorithms are related to each other, giving transformations from one type of reparameterization to another while maintaining monotonicity relative to a common objective function. Finally, we quantify when the MAP solution can and cannot be decoded directly from the dual LP relaxation.
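The abstract's central idea is that a block coordinate descent step over a spanning tree reduces to exact max-sum (Viterbi-style) inference on that tree. As a minimal illustrative sketch (not the paper's implementation), the following pure-Python function solves that inner subproblem exactly for the simplest spanning tree, a chain; the function name and parameter layout are hypothetical:

```python
def map_on_chain(unary, pairwise):
    """Exact MAP on a chain-structured pairwise model via max-sum dynamic
    programming -- the kind of tree subproblem a tree-block coordinate
    descent step solves exactly (a chain is the simplest spanning tree).

    unary[t][a]        -- score for assigning x_t = a
    pairwise[t][a][b]  -- score for the pair (x_t = a, x_{t+1} = b)
    """
    n = len(unary)       # number of variables in the chain
    k = len(unary[0])    # number of states per variable
    # Backward pass: best[t][a] = best total score of x_{t+1..n-1} given x_t = a.
    best = [[0.0] * k for _ in range(n)]
    for t in range(n - 2, -1, -1):
        for a in range(k):
            best[t][a] = max(pairwise[t][a][b] + unary[t + 1][b] + best[t + 1][b]
                             for b in range(k))
    # Forward pass: decode the argmax assignment greedily.
    x = [max(range(k), key=lambda a: unary[0][a] + best[0][a])]
    for t in range(n - 1):
        a = x[t]
        x.append(max(range(k),
                     key=lambda b: pairwise[t][a][b] + unary[t + 1][b] + best[t + 1][b]))
    return x
```

For example, with a unary preference for state 0 at the first variable and agreement-rewarding edges, the decoded MAP assignment sets every variable to 0. In the paper's setting, such exact tree solves are interleaved with reparameterization updates so that each block step monotonically improves the common dual objective.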

Original language: English (US)
Pages (from-to): 544-551
Number of pages: 8
Journal: Journal of Machine Learning Research
Volume: 5
State: Published - 2009

Fingerprint

  • Coordinate Descent
  • Graphical Models
  • Reparameterization
  • Linear programming
  • Dual Algorithm
  • LP Relaxation
  • Linear Programming Relaxation
  • Spanning tree
  • Probabilistic Model
  • Monotonicity
  • Objective function
  • Statistical Models

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Cite this

Tree block coordinate descent for MAP in graphical models. / Sontag, David; Jaakkola, Tommi.

In: Journal of Machine Learning Research, Vol. 5, 2009, p. 544-551.


Sontag, David ; Jaakkola, Tommi. / Tree block coordinate descent for MAP in graphical models. In: Journal of Machine Learning Research. 2009 ; Vol. 5. pp. 544-551.
@article{6a9dae8053984196a1559efb7cf425d6,
title = "Tree block coordinate descent for MAP in graphical models",
abstract = "A number of linear programming relaxations have been proposed for finding most likely settings of the variables (MAP) in large probabilistic models. The relaxations are often succinctly expressed in the dual and reduce to different types of reparameterizations of the original model. The dual objectives are typically solved by performing local block coordinate descent steps. In this work, we show how to perform block coordinate descent on spanning trees of the graphical model. We also show how all of the earlier dual algorithms are related to each other, giving transformations from one type of reparameterization to another while maintaining monotonicity relative to a common objective function. Finally, we quantify when the MAP solution can and cannot be decoded directly from the dual LP relaxation.",
author = "David Sontag and Tommi Jaakkola",
year = "2009",
language = "English (US)",
volume = "5",
pages = "544--551",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
}

TY  - JOUR
T1  - Tree block coordinate descent for MAP in graphical models
AU  - Sontag, David
AU  - Jaakkola, Tommi
PY  - 2009
Y1  - 2009
N2  - A number of linear programming relaxations have been proposed for finding most likely settings of the variables (MAP) in large probabilistic models. The relaxations are often succinctly expressed in the dual and reduce to different types of reparameterizations of the original model. The dual objectives are typically solved by performing local block coordinate descent steps. In this work, we show how to perform block coordinate descent on spanning trees of the graphical model. We also show how all of the earlier dual algorithms are related to each other, giving transformations from one type of reparameterization to another while maintaining monotonicity relative to a common objective function. Finally, we quantify when the MAP solution can and cannot be decoded directly from the dual LP relaxation.
AB  - A number of linear programming relaxations have been proposed for finding most likely settings of the variables (MAP) in large probabilistic models. The relaxations are often succinctly expressed in the dual and reduce to different types of reparameterizations of the original model. The dual objectives are typically solved by performing local block coordinate descent steps. In this work, we show how to perform block coordinate descent on spanning trees of the graphical model. We also show how all of the earlier dual algorithms are related to each other, giving transformations from one type of reparameterization to another while maintaining monotonicity relative to a common objective function. Finally, we quantify when the MAP solution can and cannot be decoded directly from the dual LP relaxation.
UR  - http://www.scopus.com/inward/record.url?scp=82455214670&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=82455214670&partnerID=8YFLogxK
M3  - Article
VL  - 5
SP  - 544
EP  - 551
JO  - Journal of Machine Learning Research
JF  - Journal of Machine Learning Research
SN  - 1532-4435
ER  -