Eye gaze for assistive manipulation
Abstract
A key challenge of human-robot collaboration is to build systems that balance the usefulness of autonomous robot behaviors with the benefits of direct human control. This balance is especially relevant for assistive manipulation systems, which promise to help people with disabilities more easily control wheelchair-mounted robot arms to accomplish activities of daily living. To provide useful assistance, robots must understand the user's goals and preferences for the task. Our insight is that systems can enhance this understanding by monitoring the user's natural eye gaze behavior, as psychology research has shown that eye gaze is responsive and relevant to the task. In this work, we show how using gaze enhances assistance algorithms. First, we analyze eye gaze behavior during teleoperated robot manipulation and compare it to literature results on by-hand manipulation. Then, we develop a pipeline for combining the raw eye gaze signal with the task context to build a rich signal for learning algorithms. Finally, we propose a novel use of eye gaze in which the robot avoids risky behavior by detecting when the user believes that the robot's behavior has a problem.
BibTeX
@conference{Aronson-2020-126624,
  author    = {Reuben M. Aronson and Henny Admoni},
  title     = {Eye gaze for assistive manipulation},
  booktitle = {Companion of the ACM/IEEE International Conference on Human-Robot Interaction (HRI '20)},
  year      = {2020},
  month     = {March},
  pages     = {552--554},
}