The feeling of success: Does touch sensing help predict grasp outcomes?

Roberto Calandra, Andrew Owens, Manu Upadhyaya, Wenzhen Yuan, Justin Lin, Edward Adelson, and Sergey Levine
Conference Paper, Proceedings of the Conference on Robot Learning (CoRL), pp. 314-323, November 2017

Abstract

A successful grasp requires careful balancing of the contact forces. Deducing whether a particular grasp will be successful from indirect measurements, such as vision, is therefore quite challenging, and direct sensing of contacts through touch provides an appealing avenue toward more successful and consistent robotic grasping. However, in order to fully evaluate the value of touch sensing for grasp outcome prediction, we must understand how touch sensing influences prediction accuracy when combined with other modalities. Doing so with conventional model-based techniques is exceptionally difficult. In this work, we investigate whether touch sensing aids in predicting grasp outcomes within a multimodal sensing framework that combines vision and touch. To that end, we collected more than 9,000 grasping trials using a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger, and evaluated visuo-tactile deep neural network models that directly predict grasp outcomes from either modality individually and from both modalities together. Our experimental results indicate that incorporating tactile readings substantially improves grasp outcome prediction.
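The abstract describes a visuo-tactile network that predicts grasp success from a camera image and the two GelSight tactile images, either per modality or fused. The paper itself gives no code here, so the PyTorch sketch below is only a minimal illustration of that kind of late-fusion classifier: the class name VisuoTactileGraspNet, all layer sizes, the 128x128 input resolution, and the fusion head are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class VisuoTactileGraspNet(nn.Module):
    """Hypothetical late-fusion classifier: one small CNN encoder per
    modality (RGB camera image plus one GelSight image per finger),
    concatenated and mapped to a binary grasp success/failure logit.
    Sizes are illustrative, not taken from the paper."""

    def __init__(self):
        super().__init__()
        def make_encoder():
            # Each modality gets its own copy of this encoder.
            return nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # -> (B, 64, 1, 1)
                nn.Flatten(),             # -> (B, 64)
            )
        self.vision_enc = make_encoder()     # RGB camera image
        self.tactile_enc_l = make_encoder()  # left-finger GelSight image
        self.tactile_enc_r = make_encoder()  # right-finger GelSight image
        # Fusion head over the concatenated per-modality features.
        self.head = nn.Sequential(
            nn.Linear(3 * 64, 128), nn.ReLU(),
            nn.Linear(128, 1),  # logit for P(grasp succeeds)
        )

    def forward(self, rgb, gel_left, gel_right):
        feats = torch.cat([
            self.vision_enc(rgb),
            self.tactile_enc_l(gel_left),
            self.tactile_enc_r(gel_right),
        ], dim=1)
        return self.head(feats)

# Usage sketch: a batch of 4 grasps with 3x128x128 images per modality.
net = VisuoTactileGraspNet()
logits = net(torch.randn(4, 3, 128, 128),
             torch.randn(4, 3, 128, 128),
             torch.randn(4, 3, 128, 128))
labels = torch.ones(4)  # dummy success labels for illustration
loss = nn.BCEWithLogitsLoss()(logits.squeeze(1), labels)
```

Dropping either the vision branch or the tactile branches from the fusion head yields the single-modality baselines the abstract compares against the combined visuo-tactile model.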

BibTeX

@conference{Calandra-2017-119924,
author = {Roberto Calandra and Andrew Owens and Manu Upadhyaya and Wenzhen Yuan and Justin Lin and Edward Adelson and Sergey Levine},
title = {The feeling of success: Does touch sensing help predict grasp outcomes?},
booktitle = {Proceedings of the Conference on Robot Learning (CoRL)},
year = {2017},
month = {November},
pages = {314--323},
}