Learning Dynamic Tactile Sensing with Robust Vision-based Training
Abstract
Dynamic tactile sensing is a fundamental ability for recognizing materials and objects. However, while humans are born with partially developed dynamic tactile sensing and quickly master this skill, the dynamic tactile sensing of today’s robots remains in its infancy. Developing such a sense requires not only better sensors, but also the right algorithms for processing these sensors’ data. For example, when classifying a material based on touch, the data is noisy and high-dimensional, and contains irrelevant signals alongside essential ones. Few classification methods from machine learning can cope with such data.
In this paper, we propose an efficient approach to inferring suitable lower-dimensional representations of the tactile data. To classify materials from touch alone, these representations are discovered autonomously using visual information about the surfaces during training. However, accurately pairing vision and tactile samples in real robot applications is a difficult problem. The proposed approach therefore works with only weak pairings between the modalities. Experiments show that the resulting approach is very robust and yields significantly higher classification performance from dynamic tactile sensing alone.
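The core idea, learning a low-dimensional tactile representation with vision available only at training time and only weakly paired, can be illustrated with a generic CCA-style construction on synthetic data. This is a minimal sketch under assumed data shapes, not the paper's actual algorithm: here the two modalities are aligned only through per-material means, and the resulting tactile projection is used vision-free at test time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for real sensor features (all names and dimensions here
# are hypothetical): each material yields a *set* of tactile samples and a
# *set* of vision samples, with no sample-to-sample correspondence.
n_materials, n_samples, d_tac, d_vis, d_latent = 5, 40, 20, 30, 3
factors = rng.normal(size=(n_materials, d_latent))   # per-material structure
W_tac = rng.normal(size=(d_latent, d_tac))
W_vis = rng.normal(size=(d_latent, d_vis))

tactile, vision, labels = [], [], []
for m in range(n_materials):
    tactile.append(factors[m] @ W_tac + 0.5 * rng.normal(size=(n_samples, d_tac)))
    vision.append(factors[m] @ W_vis + 0.5 * rng.normal(size=(n_samples, d_vis)))
    labels += [m] * n_samples
X, Y, labels = np.vstack(tactile), np.vstack(vision), np.array(labels)

def whitener(A):
    """Return (mean, transform) so that (A - mean) @ transform is whitened."""
    mu = A.mean(0)
    _, s, Vt = np.linalg.svd(A - mu, full_matrices=False)
    return mu, Vt.T / s

mu_x, Tx = whitener(X)
mu_y, Ty = whitener(Y)

# Weak pairing: align the modalities only through per-material means.
Xm = np.array([((X - mu_x) @ Tx)[labels == m].mean(0) for m in range(n_materials)])
Ym = np.array([((Y - mu_y) @ Ty)[labels == m].mean(0) for m in range(n_materials)])

# CCA-style step: tactile directions most correlated with the vision features.
U, _, _ = np.linalg.svd(Xm.T @ Ym, full_matrices=False)
P = Tx @ U[:, :d_latent]   # tactile -> low-dim space; no vision needed at test

# Tactile-only classification by nearest class centroid in the learned space
# (evaluated on the training samples here, just to keep the sketch short).
Z = (X - mu_x) @ P
centroids = np.array([Z[labels == m].mean(0) for m in range(n_materials)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
acc = (pred == labels).mean()
print(f"tactile-only accuracy: {acc:.2f}")
```

The key property the sketch shares with the paper's setting is that vision enters only through the training-time projection `P`; at test time, classification uses the tactile modality alone.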
BibTeX
@article{Kroemer-2011-112219,
author = {Oliver Kroemer and Christoph H. Lampert and Jan Peters},
title = {Learning Dynamic Tactile Sensing with Robust Vision-based Training},
journal = {IEEE Transactions on Robotics},
year = {2011},
month = {June},
volume = {27},
number = {3},
pages = {545--557},
}