Event Time: 4:30 pm to 12:00 am
Event Location: NSH 3305
Bio: Kaijen Hsiao received her B.S.E. degree in Mechanical Engineering in 2002
from Princeton University and her Ph.D. degree in Computer Science in 2009
from MIT, where she worked with Tomas Lozano-Perez and Leslie Kaelbling. She
is now a research scientist at Willow Garage, where her research interests
lie in grasping and manipulation.
Abstract: Robotic manipulation of objects is much more difficult in unstructured
environments (such as people's homes) than in structured ones (such as
factories) because of the presence of uncertainty. Uncertainty comes in many
forms in manipulation tasks: uncertainty in object identities, poses, or
shapes, or in robot poses or shapes, for instance. In this talk, I will
discuss my thesis work on grasping objects of known shape robustly under
significant uncertainty in object pose. To reason explicitly about
uncertainty while grasping, we model the problem as a partially observable
Markov decision process (POMDP). We derive a closed-loop strategy that
maintains a belief state (a probability distribution over world states) and
selects actions over a receding horizon using forward search through the
belief space. Our actions are world-relative trajectories (WRTs): fixed
trajectories expressed relative to the most-likely state of the world. We
localize the object, ensure its reachability, and robustly grasp it at a
specified position by using information-gathering, reorientation, and
goal-seeking WRT actions. This framework is used to grasp objects (including
a power drill and a Brita pitcher) despite significant pose uncertainty,
using a 7-DOF Barrett Arm and attached 4-DOF Barrett Hand equipped with
force and contact sensors. I will also present more recent work on
reactively adjusting grasps in a model-free way, using tactile sensors
during grasp execution, and on selecting grasps under uncertainty in object
shape, in which we probabilistically combine the results of multiple grasp
planners and evaluators across multiple candidate object representations to
select the grasps most likely to succeed under all of the object hypotheses.
Both are demonstrated on the PR2 robot from Willow Garage.
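For readers unfamiliar with belief-space planning, the sketch below illustrates the general idea described in the abstract: maintain a probability distribution over object poses, and at each step pick the world-relative trajectory whose expected outcome looks best after simulated observations. This is only a minimal illustrative sketch, not the system presented in the talk; the belief representation, the helper callables (simulate_observation, obs_likelihood, value), and the one-step horizon are simplifying assumptions.

```python
# Illustrative sketch (hypothetical, not the talk's implementation): a discrete
# belief over object-pose hypotheses, a Bayes update after each world-relative
# trajectory (WRT) action, and a one-step receding-horizon action choice.

def update_belief(belief, action, observation, obs_likelihood):
    """Bayes update: reweight each pose hypothesis by how well it explains the
    observation obtained while executing `action`, then renormalize."""
    weights = {pose: prob * obs_likelihood(observation, pose, action)
               for pose, prob in belief.items()}
    total = sum(weights.values())
    return {pose: w / total for pose, w in weights.items()}

def most_likely_pose(belief):
    """WRTs are expressed relative to the most likely world state."""
    return max(belief, key=belief.get)

def choose_wrt(belief, wrt_actions, simulate_observation, obs_likelihood, value):
    """One-step forward search through belief space: for each candidate WRT
    (anchored to the most likely pose), average the value of the updated
    belief over observations simulated under each pose hypothesis."""
    anchor = most_likely_pose(belief)
    best_action, best_score = None, float("-inf")
    for action in wrt_actions:
        score = 0.0
        for pose, prob in belief.items():
            obs = simulate_observation(action, anchor, true_pose=pose)
            next_belief = update_belief(belief, action, obs, obs_likelihood)
            score += prob * value(next_belief, action)
        if score > best_score:
            best_action, best_score = action, score
    return best_action
```

In the approach described above, the search extends over a receding horizon rather than a single step, and the candidate actions include information-gathering, reorientation, and goal-seeking WRTs.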