
Learning Thin Junction Trees via Graph Cuts

Dafna Shahaf, Anton Chechetka, and Carlos Ernesto Guestrin
Conference Paper, Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS '09), pp. 113-122, April 2009

Abstract

Structure learning algorithms usually focus on the compactness of the learned model. However, for general compact models, both exact and approximate inference are still NP-hard. Focusing only on compactness therefore leads to learning models that require approximate inference techniques, reducing their prediction quality. In this paper, we propose a method for learning an attractive class of models: bounded-treewidth junction trees, which permit both compact representation of probability distributions and efficient exact inference. Using a Bethe approximation of the likelihood, we transform the problem of finding a good junction tree separator into a minimum cut problem on a weighted graph. Building on this graph cut intuition, we present an efficient algorithm with theoretical guarantees for finding good separators, which we apply recursively to obtain a thin junction tree. Our extensive empirical evaluation demonstrates the benefit of applying exact inference using our models to answer queries. We also extend our technique to learning low-treewidth conditional random fields, and demonstrate significant improvements over state-of-the-art block-L1 regularization techniques.
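To make the cut-and-recurse idea concrete, the following minimal Python sketch builds a weighted graph over variables, repeatedly splits variable sets with a global minimum cut, and stops when a set is small enough to fit in a bounded-size clique. It is illustrative only: the edge weights (absolute pairwise correlation), the Stoer-Wagner cut from networkx, and the stopping rule are stand-in assumptions; the paper's actual weights come from the Bethe approximation of the likelihood, and its separator search and guarantees differ in important ways.

# Toy sketch of recursive min-cut decomposition of a variable set.
# Placeholder scoring and stopping rule; NOT the authors' algorithm.
import itertools
import networkx as nx
import numpy as np


def build_weighted_graph(data):
    """Graph over variables; edge weight = |correlation| (placeholder score)."""
    n_vars = data.shape[1]
    corr = np.corrcoef(data, rowvar=False)
    g = nx.Graph()
    g.add_nodes_from(range(n_vars))
    for i, j in itertools.combinations(range(n_vars), 2):
        # Small constant keeps all weights positive and the graph connected.
        g.add_edge(i, j, weight=abs(corr[i, j]) + 1e-6)
    return g


def recursive_cut(g, max_clique_size, components=None):
    """Recursively split variable sets via global min cut until small enough."""
    if components is None:
        components = []
    if g.number_of_nodes() <= max_clique_size:
        components.append(set(g.nodes))
        return components
    # Stoer-Wagner global minimum cut on the weighted graph.
    _, (part_a, part_b) = nx.stoer_wagner(g)
    recursive_cut(g.subgraph(part_a).copy(), max_clique_size, components)
    recursive_cut(g.subgraph(part_b).copy(), max_clique_size, components)
    return components


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.normal(size=(500, 8))  # 500 samples, 8 variables
    graph = build_weighted_graph(samples)
    print(recursive_cut(graph, max_clique_size=3))

The cut found at each level plays the role of a candidate separator: variables on opposite sides interact only through it, which is what keeps the resulting clique sizes (and hence the treewidth) bounded.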

BibTeX

@conference{Shahaf-2009-10199,
author = {Dafna Shahaf and Anton Chechetka and Carlos Ernesto Guestrin},
title = {Learning Thin Junction Trees via Graph Cuts},
booktitle = {Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS '09)},
year = {2009},
month = {April},
pages = {113--122},
}