Probabilistic Planning for Robotic Exploration
Abstract
Robotic exploration tasks involve inherent uncertainty. They typically include navigating through unknown terrain, searching for features that may or may not be present, and intelligently reacting to data from noisy sensors (for example, a search and rescue robot, believing it has detected a trapped earthquake victim, might stop to check for signs of life). Exploration domains are distinguished both by the prevalence of uncertainty and by the importance of intelligent information gathering. An exploring robot must understand what unknown information is most relevant to its goals, how to gather that information, and how to incorporate the results into its future actions.

This thesis has two main components. First, we present planning algorithms that generate robot control policies for partially observable Markov decision process (POMDP) planning problems. POMDP models explicitly represent the uncertain state of the world using a probability distribution over possible states, and they allow the planner to reason about information-gathering actions in a way that is decision-theoretically optimal. Relative to existing POMDP planning algorithms, our algorithms can more quickly generate approximately optimal policies, taking advantage of innovations in efficient value function representation, heuristic search, and state abstraction. This improved POMDP planning is important both to exploration domains and to a wider class of decision problems.

Second, we demonstrate the relevance of onboard science data analysis and POMDP planning to robotic exploration. Our experiments centered on a robot deployed to map the distribution of life in the Atacama Desert of Chile, using operational techniques similar to a Mars mission. We found that science autonomy and POMDP planning techniques significantly improved science yield for exploration tasks conducted both in simulation and onboard the robot.
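The belief-state bookkeeping the abstract describes can be sketched as a standard Bayes filter over a discrete state space. The sketch below is illustrative only: the two-state "victim present / absent" model, the function name, and all probabilities are assumptions for the example, not models from the thesis.

```python
def belief_update(belief, action, observation, T, O):
    """Return the posterior belief b'(s') after taking `action` and
    receiving `observation`, given transition model T[a][s][s'] and
    observation model O[a][s'][o]. (Hypothetical helper for illustration.)"""
    n = len(belief)
    # Prediction step: propagate the belief through the transition model.
    predicted = [sum(belief[s] * T[action][s][sp] for s in range(n))
                 for sp in range(n)]
    # Correction step: weight each state by the observation likelihood,
    # then renormalize so the belief remains a probability distribution.
    unnorm = [O[action][sp][observation] * predicted[sp] for sp in range(n)]
    total = sum(unnorm)
    return [w / total for w in unnorm]

# Toy model: states 0 = "no victim", 1 = "victim"; one sensing action (0);
# observations 0 = negative reading, 1 = positive reading.
T = {0: [[1.0, 0.0], [0.0, 1.0]]}  # sensing does not change the world
O = {0: [[0.8, 0.2], [0.3, 0.7]]}  # noisy detector: P(obs | action, state)

# Starting from total ignorance, a positive reading shifts belief toward
# "victim present".
b = belief_update([0.5, 0.5], 0, 1, T, O)
```

A POMDP planner reasons over such belief states rather than raw states, which is what lets it weigh the value of an information-gathering action (e.g., stopping to sense again) against acting on the current belief.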
BibTeX
@phdthesis{Smith-2007-9778,
author = {Trey Smith},
title = {Probabilistic Planning for Robotic Exploration},
year = {2007},
month = {July},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-07-26},
}