Perceptual Interpretation for Autonomous Navigation through Dynamic Imitation Learning

Conference Paper, Proceedings of the 14th International Symposium of Robotics Research (ISRR '09), pp. 433-449, August 2009

Abstract

Achieving high-performance autonomous navigation is a central goal of field robotics. Efficient navigation by a mobile robot depends not only on the individual performance of the perception and planning systems, but on how well these systems are coupled. When the perception problem is clearly defined, as in well-structured environments, this coupling (in the form of a cost function) is also well defined. However, as environments become less structured and more difficult to interpret, more complex cost functions are required, increasing the difficulty of their design. Recently, a class of machine learning techniques has been developed that relies upon expert demonstration to learn a function mapping perceptual data to costs. These algorithms choose the cost function such that the robot's planned behavior mimics the expert's demonstration as closely as possible. In this work, we extend these methods to address the challenges of dynamic and incomplete online perceptual data, as well as noisy and imperfect expert demonstration. We validate our approach on a large-scale outdoor robot with hundreds of kilometers of autonomous navigation through complex natural terrains.
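
The following minimal Python sketch illustrates the general idea behind this class of methods (in the spirit of maximum-margin planning / LEARCH-style functional gradient updates), not the paper's actual implementation: plan under the current cost function, then adjust feature weights so demonstrated cells become cheaper and cells the planner preferred instead become more expensive. The grid size, per-cell features, and demonstrated path below are illustrative assumptions.

    # Sketch only: learn a linear weight vector w mapping per-cell perceptual
    # features to costs so that a planner's path matches an expert demonstration.
    import heapq
    import numpy as np

    def dijkstra(cost, start, goal):
        # Shortest 4-connected path on a grid of strictly positive per-cell costs.
        h, w = cost.shape
        dist = np.full((h, w), np.inf)
        prev = {}
        dist[start] = cost[start]
        pq = [(cost[start], start)]
        while pq:
            d, (r, c) = heapq.heappop(pq)
            if (r, c) == goal:
                break
            if d > dist[r, c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                    dist[nr, nc] = d + cost[nr, nc]
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (dist[nr, nc], (nr, nc)))
        # Reconstruct the path from goal back to start.
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1]

    def learn_cost_weights(features, expert_path, start, goal, iters=50, lr=0.05):
        # Learn w so that planning under cost = exp(features . w) reproduces the demonstration.
        n_feat = features.shape[-1]
        w = np.zeros(n_feat)
        for _ in range(iters):
            cost = np.exp(features @ w)           # exponentiation keeps costs positive
            planned = dijkstra(cost, start, goal)
            grad = np.zeros(n_feat)
            for cell in planned:                  # raise cost where the planner disagrees
                grad += features[cell]
            for cell in expert_path:              # lower cost along the demonstration
                grad -= features[cell]
            w += lr * grad
        return w

    # Illustrative usage with random per-cell features and a hand-picked expert path.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(10, 10, 3))
    expert_path = [(0, c) for c in range(10)] + [(r, 9) for r in range(1, 10)]
    w = learn_cost_weights(features, expert_path, start=(0, 0), goal=(9, 9))
    print("learned feature weights:", w)

When the planned path matches the demonstration, the two gradient sums cancel and the update vanishes; that fixed point, where the planner's behavior under the learned costs mimics the expert, is exactly what these methods seek.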

BibTeX

@conference{Silver-2009-10295,
author = {David Silver and J. Andrew (Drew) Bagnell and Anthony (Tony) Stentz},
title = {Perceptual Interpretation for Autonomous Navigation through Dynamic Imitation Learning},
booktitle = {Proceedings of 14th International Symposium of Robotics Research (ISRR '09)},
year = {2009},
month = {August},
pages = {433--449},
}