Modeling Interaction via the Principle of Maximum Causal Entropy
Conference Paper, Proceedings of the International Conference on Machine Learning (ICML), pp. 1255-1262, June 2010
Abstract
The principle of maximum entropy provides a powerful framework for statistical models of joint, conditional, and marginal distributions. However, there are many important distributions with elements of interaction and feedback where its applicability has not been established. This work presents the principle of maximum causal entropy—an approach based on causally conditioned probabilities that can appropriately model the availability and influence of sequentially revealed side information. Using this principle, we derive models for sequential data with revealed information, interaction, and feedback, and demonstrate their applicability for statistically framing inverse optimal control and decision prediction tasks.
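As a brief sketch of the central quantity (using standard notation from the causal-entropy literature; the symbols below are illustrative and not copied verbatim from the paper), the causally conditioned probability of an action sequence given sequentially revealed states factors so that each action depends only on information available at that time step, and the causal entropy is its expected log-loss:

```latex
% Causally conditioned probability: action A_t may depend only on
% past actions A_{1:t-1} and states revealed so far, S_{1:t}.
P(\mathbf{A}^T \,\|\, \mathbf{S}^T) \;=\; \prod_{t=1}^{T} P(A_t \mid A_{1:t-1},\, S_{1:t})

% Causal entropy: the expected log-loss of this factorization,
% which decomposes over time steps.
H(\mathbf{A}^T \,\|\, \mathbf{S}^T)
\;=\; \mathbb{E}\!\left[-\log P(\mathbf{A}^T \,\|\, \mathbf{S}^T)\right]
\;=\; \sum_{t=1}^{T} H(A_t \mid A_{1:t-1},\, S_{1:t})
```

The maximum causal entropy principle then selects, among distributions satisfying behavioral constraints (e.g., feature matching against demonstrated behavior), the one maximizing this causal entropy, which is what permits the framework to model sequentially revealed side information rather than conditioning on the full future state sequence.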
BibTeX
@conference{Ziebart-2010-10470,
  author = {Brian D. Ziebart and J. Andrew (Drew) Bagnell and Anind Dey},
  title = {Modeling Interaction via the Principle of Maximum Causal Entropy},
  booktitle = {Proceedings of (ICML) International Conference on Machine Learning},
  year = {2010},
  month = {June},
  pages = {1255--1262},
  keywords = {Maximum Entropy, Causal Entropy, interaction, inverse optimal control, game theory},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.