Articulated Robot Motion for Simultaneous Localization and Mapping (ARM-SLAM)
Abstract
A robot with a hand-mounted depth sensor scans a scene. When the robot's joint angles are not known with certainty, how can it best reconstruct the scene? In this work, we simultaneously estimate the joint angles of the robot and reconstruct a dense volumetric model of the scene. In this way, we perform simultaneous localization and mapping in the configuration space of the robot, rather than in the pose space of the camera. We show using simulations and robot experiments that our approach greatly reduces both 3D reconstruction error and joint angle error over simply using the forward kinematics. Unlike other approaches, ours directly reasons about robot joint angles, and can use these to constrain the pose of the sensor. Because of this, it is more robust to missing or ambiguous depth data than approaches that are unconstrained by the robot's kinematics.
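To make the core idea concrete, below is a minimal sketch (not the paper's implementation) of SLAM in configuration space: rather than optimizing a free 6-DoF camera pose, we optimize a small correction to the reported joint angles so that depth measurements, pushed through the forward kinematics, best align with the current volumetric model. All names here (fk_camera_pose, tsdf_lookup, the toy 2-DoF arm and sphere model) are hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def fk_camera_pose(joint_angles):
    """Hypothetical forward kinematics: joint angles -> 4x4 camera pose.
    A real system would chain the robot's URDF/DH transforms here."""
    l1, l2 = 0.5, 0.4                      # toy 2-DoF planar arm, camera at end effector
    a1, a2 = joint_angles
    x = l1 * np.cos(a1) + l2 * np.cos(a1 + a2)
    y = l1 * np.sin(a1) + l2 * np.sin(a1 + a2)
    yaw = a1 + a2
    T = np.eye(4)
    T[:3, :3] = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                          [np.sin(yaw),  np.cos(yaw), 0.0],
                          [0.0,          0.0,         1.0]])
    T[:3, 3] = [x, y, 0.0]
    return T

def tsdf_lookup(points):
    """Hypothetical signed-distance lookup: distance of each world-frame point
    to the current scene model (a stand-in unit sphere at the origin)."""
    return np.linalg.norm(points, axis=1) - 1.0

def alignment_cost(delta, nominal_angles, depth_points_cam):
    """Sum of squared TSDF values at the depth points transformed into the
    world frame through the *corrected* kinematics (nominal + delta)."""
    T = fk_camera_pose(nominal_angles + delta)
    pts_h = np.c_[depth_points_cam, np.ones(len(depth_points_cam))]
    pts_world = (T @ pts_h.T).T[:, :3]
    return np.sum(tsdf_lookup(pts_world) ** 2)

# Encoder-reported joint angles, assumed slightly wrong, and a fake depth cloud.
nominal = np.array([0.3, 0.5])
depth_points_cam = np.random.randn(200, 3) * 0.05

# Optimize a correction in configuration space (2 joint variables),
# not an unconstrained pose in SE(3).
res = minimize(alignment_cost, x0=np.zeros(2),
               args=(nominal, depth_points_cam), method="Nelder-Mead")
corrected_angles = nominal + res.x
```

Because the correction lives in the robot's joint space, the estimated camera pose is automatically constrained to poses the arm can actually reach, which is what makes the approach robust when the depth data alone is ambiguous.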
This paper was published through the ICRA/RA-L option: it appears in the IEEE Robotics and Automation Letters journal and was also presented at the ICRA conference.
BibTeX
@article{Klingensmith-2016-5479,
  author  = {Matthew Klingensmith and Siddhartha Srinivasa and Michael Kaess},
  title   = {Articulated Robot Motion for Simultaneous Localization and Mapping (ARM-SLAM)},
  journal = {IEEE Robotics and Automation Letters},
  year    = {2016},
  month   = {July},
  volume  = {1},
  number  = {2},
  pages   = {1156--1163},
}