Tactile Gestures for Human/Robot Interaction - Robotics Institute Carnegie Mellon University

Richard Voyles and Pradeep Khosla
Conference Paper, Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, pp. 7 - 13, August, 1995

Abstract

Gesture-Based Programming is a new paradigm to ease the burden of programming robots. By tapping into the user's wealth of experience with contact transitions, compliance, uncertainty, and operations sequencing, we hope to provide a more intuitive programming environment for complex, real-world tasks based on the expressiveness of non-verbal communication. Accomplishing this requires the ability to interpret gestures and infer the intentions behind them. As a first step toward this goal, this paper presents an application of distributed perception for inferring a user's intentions by observing tactile gestures. These gestures consist of sparse, inexact, physical "nudges" applied to the robot's end effector to modify its trajectory in free space. A set of independent agents - each with its own local, fuzzified, heuristic model of a particular trajectory parameter - observes data from a wrist force/torque sensor to evaluate the gestures. The agents then independently determine the confidence of their respective findings, and distributed arbitration resolves the interpretation through voting.
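The arbitration scheme the abstract describes - independent agents each mapping sensor data to a proposed trajectory change plus a self-assessed confidence, with the interpretation resolved by voting - can be sketched as follows. This is not the paper's implementation; every agent name, threshold, and heuristic below is a hypothetical stand-in for the fuzzified models the authors actually used.

```python
from dataclasses import dataclass

@dataclass
class GestureVote:
    parameter: str     # trajectory parameter the agent models (e.g. "speed")
    change: float      # proposed adjustment to that parameter
    confidence: float  # agent's self-assessed confidence in [0, 1]

def speed_agent(force_z: float) -> GestureVote:
    # Hypothetical heuristic: a push along the direction of travel
    # suggests "go faster"; confidence grows with nudge magnitude.
    confidence = min(abs(force_z) / 10.0, 1.0)
    return GestureVote("speed", 0.1 if force_z > 0 else -0.1, confidence)

def heading_agent(torque_x: float) -> GestureVote:
    # Hypothetical heuristic: lateral torque suggests a heading change.
    confidence = min(abs(torque_x) / 5.0, 1.0)
    return GestureVote("heading", 0.05 if torque_x > 0 else -0.05, confidence)

def arbitrate(votes: list[GestureVote]) -> tuple[str, float]:
    # Confidence-weighted arbitration: the agent most confident in its
    # local interpretation wins and its proposed change is applied.
    best = max(votes, key=lambda v: v.confidence)
    return best.parameter, best.change
```

Under this sketch, a firm forward nudge (large `force_z`) would out-vote a slight twist, so the arbiter would interpret the gesture as a speed change rather than a heading change.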

BibTeX

@conference{Voyles-1995-13947,
author = {Richard Voyles and Pradeep Khosla},
title = {Tactile Gestures for Human/Robot Interaction},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {1995},
month = {August},
volume = {3},
pages = {7--13},
}