Learning to Plan for Constrained Manipulation from Demonstrations
Abstract
Motion planning in high-dimensional state spaces, such as for mobile manipulation, is a challenging problem. Constrained manipulation, e.g., opening articulated objects like doors or drawers, is also hard since sampling states on the constraint manifold is expensive. Further, planning for such tasks requires combining planning in free space, to reach a desired grasp or contact location, with planning for the constrained manipulation motion itself, often necessitating a slow two-step process in traditional approaches. In this work, we show that combined planning for such tasks can be dramatically accelerated by providing user demonstrations of the constrained manipulation motions. In particular, we show how such demonstrations can be incorporated into a recently developed framework of planning with experience graphs, which encode and reuse previous experiences. We focus on tasks involving articulation constraints, e.g., door or drawer opening, where the motion of the object itself involves only a single degree of freedom. We provide experimental results with the PR2 robot opening a variety of such articulated objects using our approach with full-body manipulation, after receiving kinesthetic demonstrations. We also provide simulated results highlighting the benefits of our approach for constrained manipulation tasks.
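The experience-graph framework referenced in the abstract biases the search toward previously seen (here, demonstrated) motions through a modified heuristic that lets the planner "ride along" experience edges whenever that is cheaper than the inflated original heuristic. The sketch below is a rough, hedged illustration of that idea, not the paper's implementation: the names (h, egraph_edges, eps_e) and the assumption that states are hashable tuples are ours.

```python
import heapq
import math

def egraph_heuristic(h, egraph_edges, goal, eps_e):
    """Illustrative E-graph-style heuristic (assumption-laden sketch).

    h            -- original admissible heuristic h(a, b), e.g. Euclidean distance
    egraph_edges -- list of (u, v, cost) edges taken from demonstrations/experience
    goal         -- goal state (hashable, e.g. a tuple)
    eps_e        -- inflation factor eps^E >= 1 controlling the bias toward experience
    Returns a function h_egraph(s) approximating
        h^E(s) = min over paths of sum_i min(eps_e * h(s_i, s_{i+1}), c^E(s_i, s_{i+1})).
    """
    # Vertices of interest: the goal plus every endpoint of an experience edge.
    verts = {goal}
    cost = {}  # symmetric experience-edge costs
    for u, v, c in egraph_edges:
        verts |= {u, v}
        cost[(u, v)] = min(c, cost.get((u, v), math.inf))
        cost[(v, u)] = cost[(u, v)]

    def w(a, b):
        # Travel along a demonstrated edge if one exists and is cheaper,
        # otherwise pay the inflated original heuristic.
        return min(eps_e * h(a, b), cost.get((a, b), math.inf))

    # Dijkstra from the goal over the small set of E-graph vertices + goal.
    dist = {v: math.inf for v in verts}
    dist[goal] = 0.0
    pq = [(0.0, goal)]
    while pq:
        d, a = heapq.heappop(pq)
        if d > dist[a]:
            continue
        for b in verts:
            if b == a:
                continue
            nd = d + w(a, b)
            if nd < dist[b]:
                dist[b] = nd
                heapq.heappush(pq, (nd, b))

    def h_egraph(s):
        # Connect the query state to the nearest E-graph vertex (or the goal
        # directly) via the inflated heuristic, then follow precomputed distances.
        return min(eps_e * h(s, v) + dist[v] for v in verts)

    return h_egraph
```

In this sketch a kinesthetic demonstration would simply contribute its discretized waypoints as a chain of (u, v, cost) edges, so the heuristic pulls the search onto the demonstrated constrained-manipulation motion while free-space segments fall back to the original heuristic.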
BibTeX
@article{Phillips-2016-109489,
  author  = {Mike Phillips and Victor Hwang and Sachin Chitta and Maxim Likhachev},
  title   = {Learning to Plan for Constrained Manipulation from Demonstrations},
  journal = {Autonomous Robots},
  year    = {2016},
  month   = {January},
  volume  = {40},
  number  = {1},
  pages   = {109--124},
}