Imitating human movement with teleoperated robotic head - Robotics Institute Carnegie Mellon University


Priyanshu Agarwal, Samer Al Moubayed, Alexander Alspach, Joohyung Kim, Elizabeth J. Carter, Jill Fain Lehman, and Katsu Yamane
Conference Paper, Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN '16), pp. 630-637, August 2016

Abstract

Effective teleoperation requires real-time control of a remote robotic system. In this work, we develop a controller for realizing smooth and accurate motion of a robotic head, with application to a teleoperation system for the Furhat robot head [1], which we call TeleFurhat. The controller uses the head motion of an operator, measured by a Microsoft Kinect 2 sensor, as the reference and applies a processing framework to condition and render the motion on the robot head. The processing framework includes a pre-filter based on a moving average filter, a neural network-based model for improving the accuracy of the raw pose measurements from the Kinect, and a constrained-state Kalman filter that uses a minimum jerk model to smooth motion trajectories and limit the magnitude of changes in position, velocity, and acceleration. Our results demonstrate that the robot can reproduce human head motion in real time with a latency of approximately 100 to 170 ms while operating within its physical limits. Furthermore, viewers prefer our new method over rendering the raw pose data from the Kinect.
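The pipeline the abstract describes — a moving-average pre-filter followed by a constrained-state Kalman filter whose process model treats jerk as noise — can be sketched per axis as below. This is a minimal illustrative sketch, not the paper's implementation: the state is assumed to be [position, velocity, acceleration], white-jerk process noise stands in for the minimum jerk model, the constraint step is a simple clamp to symmetric limits, and all class names, limits, and noise parameters are hypothetical choices (the neural network correction stage is omitted).

```python
import numpy as np

def moving_average(signal, window=5):
    """Causal moving-average pre-filter over the last `window` samples."""
    out = np.empty(len(signal), dtype=float)
    for i in range(len(signal)):
        out[i] = signal[max(0, i - window + 1):i + 1].mean()
    return out

class ConstrainedKalman1D:
    """Per-axis Kalman filter with state [pos, vel, acc].

    White-jerk process noise approximates a minimum-jerk motion model;
    after each measurement update the state is clamped to position,
    velocity, and acceleration limits (an illustrative stand-in for the
    paper's constrained-state filter). All parameter values are hypothetical.
    """
    def __init__(self, dt, q_jerk=1.0, r_meas=0.01,
                 p_lim=1.0, v_lim=2.0, a_lim=10.0):
        self.x = np.zeros(3)                      # [pos, vel, acc]
        self.P = np.eye(3)                        # state covariance
        self.F = np.array([[1.0, dt, 0.5 * dt**2],
                           [0.0, 1.0, dt],
                           [0.0, 0.0, 1.0]])      # constant-acc transition
        g = np.array([dt**3 / 6, dt**2 / 2, dt])  # jerk-to-state mapping
        self.Q = q_jerk * np.outer(g, g)          # white-jerk process noise
        self.H = np.array([[1.0, 0.0, 0.0]])      # position-only measurement
        self.R = np.array([[r_meas]])
        self.lims = np.array([p_lim, v_lim, a_lim])

    def step(self, z):
        """One predict/update cycle for scalar position measurement `z`."""
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(3) - K @ self.H) @ self.P
        # Clamp the state to the robot's physical limits
        self.x = np.clip(self.x, -self.lims, self.lims)
        return self.x[0]
```

Run per degree of freedom (e.g. head yaw, pitch, roll): pre-filter the raw Kinect angle stream with `moving_average`, then feed each sample through `ConstrainedKalman1D.step` at the sensor rate (`dt = 1/30` for Kinect 2) to obtain a smoothed, limit-respecting command for the robot head.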

BibTeX

@conference{Agarwal-2016-122477,
author = {Priyanshu Agarwal and Samer Al Moubayed and Alexander Alspach and Joohyung Kim and Elizabeth J. Carter and Jill Fain Lehman and Katsu Yamane},
title = {Imitating human movement with teleoperated robotic head},
booktitle = {Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN '16)},
year = {2016},
month = {August},
pages = {630--637},
}