Learning Selectively Conditioned Forest Structures with Applications to DBNs and Classification
Abstract
Dealing with uncertainty in Bayesian Network structures using maximum a posteriori (MAP) estimation or Bayesian Model Averaging (BMA) is often intractable due to the superexponential number of possible directed, acyclic graphs. When the prior is decomposable, two classes of graphs where efficient learning can take place are tree structures and fixed orderings with limited in-degree. We show how MAP estimates and BMA for selectively conditioned forests (SCF), a combination of these two classes, can be computed efficiently for ordered sets of variables. We apply SCFs to temporal data to learn Dynamic Bayesian Networks having an intra-timestep forest and inter-timestep limited in-degree structure, improving model accuracy over DBNs without the combination of structures. We also apply SCFs to Bayes Net classification to learn selective forest-augmented Naive Bayes classifiers. Based on empirical evidence, we argue that the built-in feature selection of selective augmented Bayes classifiers makes them preferable to similar non-selective classifiers.
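As a rough illustration of why a fixed ordering makes the forest MAP computation tractable, the following Python sketch (with a hypothetical decomposable local_score function, not from the paper) selects at most one earlier parent per variable. Because every edge points backward in the ordering and each node has in-degree at most one, the per-node choices decouple and the result is guaranteed to be a forest.

def map_forest_ordered(local_score, n):
    # local_score(child, parent) is an assumed decomposable score,
    # e.g. the log marginal likelihood of `child` given `parent`;
    # local_score(child, None) scores the parentless case.
    parents = {}
    for child in range(n):
        best_parent = None
        best = local_score(child, None)
        for parent in range(child):        # candidate parents precede the child
            s = local_score(child, parent)
            if s > best:
                best, best_parent = s, parent
        parents[child] = best_parent       # None marks a tree root
    return parents

Under these assumptions the search is O(n^2) score evaluations rather than a search over all acyclic graphs, which is the efficiency the abstract refers to.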
BibTeX
@conference{Ziebart-2007-9774,
author = {Brian D. Ziebart and Anind Dey and J. Andrew (Drew) Bagnell},
title = {Learning Selectively Conditioned Forest Structures with Applications to DBNs and Classification},
booktitle = {Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence (UAI '07)},
year = {2007},
month = {July},
pages = {458-465},
keywords = {Bayesian Networks, structure learning, augmented naive Bayes},
}