Recognizing social touch gestures using recurrent and convolutional neural networks

Dana Hughes, Alon Krauthammer, and Nikolaus Correll
Conference Paper, Proceedings of (ICRA) International Conference on Robotics and Automation, pp. 2315-2321, May 2017

Abstract

Deep learning approaches have been used to perform classification in several applications with high-dimensional input data. In this paper, we investigate the potential of deep learning for classifying affective touch on robotic skin in a social setting. Three models are considered: a convolutional neural network, a convolutional-recurrent neural network, and an autoencoder-recurrent neural network. These models are evaluated on two publicly available affective touch datasets and compared with models built to classify the same datasets. The deep learning approaches provide a similar level of accuracy and allow gestures to be predicted in real time at a rate of 6 to 9 Hz. The memory requirements of the models are small enough for them to be implemented on small, inexpensive microcontrollers, demonstrating that classification can be performed in the skin itself by collocating computing elements with the sensor array.
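To make the convolutional-recurrent approach concrete, the sketch below shows one plausible way to structure such a classifier in PyTorch: a per-frame convolutional feature extractor followed by an LSTM over the frame sequence and a linear gesture classifier. This is an illustrative assumption, not the paper's exact architecture; the 8x8 pressure-grid size, layer widths, 14 gesture classes, and sequence length are placeholders chosen for the example.

```python
# Hedged sketch of a convolutional-recurrent touch-gesture classifier.
# Grid size, channel counts, hidden size, and class count are assumptions
# for illustration, not the architecture reported in the paper.
import torch
import torch.nn as nn


class ConvRecurrentClassifier(nn.Module):
    def __init__(self, grid_size=8, num_classes=14, hidden_size=64):
        super().__init__()
        # Per-frame convolutional feature extractor over the pressure image.
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Flatten(),  # -> 16 * grid_size * grid_size features per frame
        )
        # Recurrent layer summarizes the sequence of frame features.
        self.rnn = nn.LSTM(16 * grid_size * grid_size, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, time, 1, grid_size, grid_size) sequence of pressure frames
        b, t = x.shape[:2]
        f = self.features(x.reshape(b * t, *x.shape[2:]))  # per-frame features
        f = f.reshape(b, t, -1)
        _, (h, _) = self.rnn(f)        # final hidden state summarizes the sequence
        return self.classifier(h[-1])  # gesture logits


if __name__ == "__main__":
    # Example: classify a batch of 2 sequences of 54 frames from an 8x8 grid.
    model = ConvRecurrentClassifier()
    frames = torch.rand(2, 54, 1, 8, 8)
    print(model(frames).shape)  # torch.Size([2, 14])
```

In this kind of design, the convolutional front end captures the spatial pattern of contact on the sensor array while the recurrent layer captures its evolution over time, which is what allows a gesture label to be updated frame by frame at interactive rates.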

BibTeX

@conference{Hughes-2017-126383,
author = {Dana Hughes and Alon Krauthammer and Nikolaus Correll},
title = {Recognizing social touch gestures using recurrent and convolutional neural networks},
booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
year = {2017},
month = {May},
pages = {2315--2321},
}