Learning Bayesian network structure using LP relaxations

Tommi Jaakkola, David Sontag, Amir Globerson, Marina Meila

Research output: Contribution to journal › Article

Abstract

We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.
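The following is not part of the publication record; it is a minimal, runnable Python/SciPy sketch of the kind of linear program the abstract describes: one indicator variable per (node, parent set) choice, a per-node constraint that exactly one parent set is selected, and cluster constraints enforcing that every cluster of nodes contains at least one node whose parents all lie outside the cluster. The 3-node setup, the random "scores", and the brute-force enumeration of all clusters are illustrative assumptions only; the paper instead maintains an outer bound, adds violated cluster constraints iteratively, and solves the resulting problem with fast dual algorithms (optionally inside branch and bound).

# Sketch of the LP relaxation idea, NOT the authors' implementation.
# Scores and problem size are made up for illustration.

from itertools import chain, combinations
from scipy.optimize import linprog
import numpy as np

def powerset(iterable, max_size=None):
    s = list(iterable)
    max_size = len(s) if max_size is None else max_size
    return chain.from_iterable(combinations(s, r) for r in range(max_size + 1))

# Toy problem: 3 nodes, parent sets of size <= 2, arbitrary illustrative scores.
nodes = [0, 1, 2]
choices = []                      # (node, parent_set) variables
scores = []                       # hypothetical decomposable local scores
rng = np.random.default_rng(0)
for i in nodes:
    for S in powerset([j for j in nodes if j != i], max_size=2):
        choices.append((i, frozenset(S)))
        scores.append(rng.normal() + 0.5 * len(S))

n_vars = len(choices)
c = -np.array(scores)             # linprog minimizes, so negate to maximize

# Each node selects exactly one parent set.
A_eq = np.zeros((len(nodes), n_vars))
for k, (i, S) in enumerate(choices):
    A_eq[i, k] = 1.0
b_eq = np.ones(len(nodes))

# Cluster constraints for every cluster C with |C| >= 2:
#   sum_{i in C} sum_{S : S disjoint from C} x[i, S] >= 1,
# written as <= -1 after negation for linprog's A_ub form.
# (The paper adds violated clusters on the fly; here we enumerate them all.)
A_ub, b_ub = [], []
for C in powerset(nodes):
    C = frozenset(C)
    if len(C) < 2:
        continue
    row = np.zeros(n_vars)
    for k, (i, S) in enumerate(choices):
        if i in C and not (S & C):
            row[k] = -1.0
    A_ub.append(row)
    b_ub.append(-1.0)

# Relax x in {0,1} to x in [0,1]; an integral optimum is a provably optimal DAG.
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=A_eq, b_eq=b_eq, bounds=(0.0, 1.0), method="highs")

print("LP optimum:", -res.fun)
for k, (i, S) in enumerate(choices):
    if res.x[k] > 1e-6:
        print(f"node {i} <- parents {sorted(S)}: weight {res.x[k]:.2f}")

In this toy, all cluster constraints fit in memory, so the LP already optimizes over the full outer bound; at realistic scale the point of the paper's cutting-plane and dual approach is to avoid ever materializing that exponential constraint set.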

Original language: English (US)
Pages (from-to): 358-365
Number of pages: 8
Journal: Journal of Machine Learning Research
Volume: 9
State: Published - 2010

Fingerprint

  • LP Relaxation
  • Bayesian Networks
  • Network Structure
  • Structure Learning
  • Polytope
  • Linear Program
  • Global Constraints
  • Integral Solution
  • Dual Algorithm
  • Fast Algorithm
  • Branch and Bound Method
  • Dynamic Programming
  • Exact Method
  • Combinatorial Problems
  • Graph in Graph Theory
  • Scoring

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Cite this

Jaakkola, T., Sontag, D., Globerson, A., & Meila, M. (2010). Learning Bayesian network structure using LP relaxations. Journal of Machine Learning Research, 9, 358-365.

Learning Bayesian network structure using LP relaxations. / Jaakkola, Tommi; Sontag, David; Globerson, Amir; Meila, Marina.

In: Journal of Machine Learning Research, Vol. 9, 2010, p. 358-365.

Research output: Contribution to journal › Article

Jaakkola, T, Sontag, D, Globerson, A & Meila, M 2010, 'Learning Bayesian network structure using LP relaxations', Journal of Machine Learning Research, vol. 9, pp. 358-365.
Jaakkola T, Sontag D, Globerson A, Meila M. Learning Bayesian network structure using LP relaxations. Journal of Machine Learning Research. 2010;9:358-365.
Jaakkola, Tommi ; Sontag, David ; Globerson, Amir ; Meila, Marina. / Learning Bayesian network structure using LP relaxations. In: Journal of Machine Learning Research. 2010 ; Vol. 9. pp. 358-365.
@article{72e88ce6a36742d4b944611b4e0131c2,
title = "Learning Bayesian network structure using LP relaxations",
abstract = "We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.",
author = "Tommi Jaakkola and David Sontag and Amir Globerson and Marina Meila",
year = "2010",
language = "English (US)",
volume = "9",
pages = "358--365",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}

TY - JOUR

T1 - Learning Bayesian network structure using LP relaxations

AU - Jaakkola, Tommi

AU - Sontag, David

AU - Globerson, Amir

AU - Meila, Marina

PY - 2010

Y1 - 2010

N2 - We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.

AB - We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.

UR - http://www.scopus.com/inward/record.url?scp=84862272478&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84862272478&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:84862272478

VL - 9

SP - 358

EP - 365

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -