1:00 pm to 12:00 am
Event Location: NSH 1305
Abstract: People with upper extremity disabilities are gaining increased independence through the use of assistive devices such as wheelchair-mounted robotic arms. However, the increased capability and dexterity of these robotic arms also makes them challenging to control through accessible interfaces like joysticks, sip-and-puff devices, and buttons, which are lower-dimensional than the control space of the robot. The potential for robot autonomy to ease control burden within assistive domains has been recognized for decades. While full autonomy is an option, it removes all control from the user. When this is not what the human desires, the assistive technology in fact makes them less able, and it discards useful input the human might provide (leveraging, for example, their superior situational awareness) that would add to system robustness.
This thesis takes an in-depth look at how to add autonomy to an assistive robot arm in the specific application of eating, to make it faster and more enjoyable for people with disabilities to feed themselves. While we focus on this specific application, the tools and insights we gain can generalize to the fields of deformable object manipulation, selection from behavior libraries, intent prediction, robot teleoperation, and human-robot interaction. The physical proximity and heavy dependence on the robot arm for daily tasks make this a very high-stakes human-robot interaction.
We propose a system that is capable of fully autonomous feeding by (1) predicting bite timing based on social cues, (2) detecting relevant features of the food using RGBD sensor data, and (3) automatically selecting a goal and a food-collection motion primitive to bring a bite from the plate to the user's mouth. We propose investigating the desired level of autonomy through user studies with an assistive robot in which users have varying degrees of control over bite timing, bite selection, action selection, control mode-switching, and direct teleoperation of the robot, to determine the effect on cognitive load, acceptance, trust, and task performance.
Committee: Siddhartha Srinivasa, Chair
Christopher Atkeson
Jodi Forlizzi
Leila Takayama, University of California, Santa Cruz