
RI Seminar

Toward Natural Interactions With Assistive Robots
Henny Admoni, Associate Professor, Robotics Institute, Carnegie Mellon University
Friday, September 22
3:30 pm to 4:30 pm
NSH 1305

Abstract
Robots can help people live better lives by assisting them with the complex tasks involved in everyday activities. This is especially impactful for people with disabilities, who can benefit from robotic assistance to increase their independence. For example, physically assistive robots can collaborate with people in preparing a meal, enabling people with motor impairments to be self-sufficient in cooking and eating. Socially assistive robots can act as tutors, coaches, and partners to help people with social or learning deficits practice the skills they have learned in a non-threatening environment. Developing effective human-robot interactions in these cases requires a multidisciplinary approach that involves fundamental robotics algorithms, insights from human psychology, and techniques from artificial intelligence and machine learning.

In this talk, I will describe my vision for robots that collaborate with and assist humans on complex tasks. I will explain how we can leverage our understanding of natural, intuitive human behaviors to detect when and how people need assistance, and then apply robotics algorithms to produce effective human-robot interactions. I will discuss how models of human attention, drawn from cognitive science, can help select robot behaviors that improve human performance on a collaborative task. I will detail my work on algorithms that predict people’s mental states based on their eye gaze and provide assistance in response to those predictions. And I will show how breaking the seamlessness of an interaction can make robots appear smarter. Throughout the talk, I will describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.

Bio
Henny Admoni is an Assistant Professor in the Robotics Institute at Carnegie Mellon University, where she works on assistive robotics and human-robot interaction. Henny develops and studies intelligent robots that improve people’s lives by providing assistance through social and physical interactions. She studies how nonverbal communication, such as eye gaze and pointing, can improve assistive interactions by revealing underlying human intentions and increasing human-robot communication. Previously, Henny was a postdoctoral fellow at CMU with Siddhartha Srinivasa in the Personal Robotics Lab. Henny completed her PhD in Computer Science at Yale University with Professor Brian Scassellati. Her PhD dissertation focused on modeling the complex dynamics of nonverbal behavior for socially assistive human-robot interaction. Henny holds an MS in Computer Science from Yale University and a joint BA/MA degree in Computer Science from Wesleyan University. Henny’s scholarship has been recognized with awards such as the NSF Graduate Research Fellowship, the Google Anita Borg Memorial Scholarship, and the Palantir Women in Technology Scholarship.