
PhD Thesis Defense

Tanmay Shankar
PhD Student, Robotics Institute, Carnegie Mellon University
Thursday, August 29
11:00 am to 1:00 pm
NSH 4305
Learning and Translating Temporal Abstractions of Behaviour across Humans and Robots

Abstract:
Humans are remarkably adept at learning to perform tasks by imitating other people who demonstrate them. Key to this is our ability to reason abstractly about the high-level strategy of the task at hand (such as the recipe for cooking a dish) and the behaviours needed to solve it (such as the behaviour of pouring liquid into a pan), while ignoring irrelevant details (such as the precise angle at which to pour).

In this talk, we describe steps towards imbuing robots with these abilities, i.e., to learn and translate temporal abstractions of behaviour across humans and robots. We first explore the question “How can we learn and represent temporal abstractions of agent behaviours and their effects on the environment?”, briefly presenting work that adopts a representation-learning perspective on skill learning. We then address the question “How can we understand demonstrator task strategies in terms of these abstractions, and translate them into corresponding abstractions for a robot to execute?”, briefly presenting work that adopts an unsupervised-translation approach to transferring abstractions of behaviour across humans and robots.
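To make the idea of unsupervised translation between embedding spaces concrete, the sketch below aligns a synthetic "human" skill-embedding space with a synthetic "robot" one. It substitutes a much simpler, well-known distribution-alignment baseline (CORAL-style second-order statistics matching) for the learned translation model in the thesis; all names and data are illustrative assumptions, not the actual method or API.

# Minimal sketch: align two skill-embedding distributions without paired
# examples, using CORAL-style whitening/re-coloring (NOT the thesis's method).
import numpy as np

def matrix_power(cov, exponent, eps=1e-6):
    """Symmetric matrix power via eigendecomposition (used for C^{-1/2}, C^{1/2})."""
    vals, vecs = np.linalg.eigh(cov)
    vals = np.clip(vals, eps, None)
    return vecs @ np.diag(vals ** exponent) @ vecs.T

def coral_align(source, target):
    """Map source embeddings into the target distribution by whitening with
    the source covariance and re-coloring with the target covariance."""
    src = source - source.mean(axis=0)
    cs = np.cov(src, rowvar=False) + np.eye(src.shape[1]) * 1e-5
    ct = np.cov(target, rowvar=False) + np.eye(target.shape[1]) * 1e-5
    aligned = src @ matrix_power(cs, -0.5) @ matrix_power(ct, 0.5)
    return aligned + target.mean(axis=0)

rng = np.random.default_rng(0)
human_skill_embeddings = rng.normal(size=(500, 16))  # synthetic "human" skill latents
robot_skill_embeddings = rng.normal(size=(400, 16)) @ rng.normal(size=(16, 16))  # differently distributed "robot" latents

translated = coral_align(human_skill_embeddings, robot_skill_embeddings)
print(translated.shape)  # (500, 16): human skills expressed with the robot latents' statistics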

The final component of our thesis revisits these questions, addressing the translation of temporal abstractions of behaviour from humans to robots from the perspective of imitating desired environmental change. Towards this, we introduce TransAct, a framework that first extends our prior skill-learning work to learn temporally abstract representations of agent-environment interactions. TransAct enables robots to consume in-domain human task demonstrations, then retrieve and compose corresponding robot-environment interactions with similar environmental effects, performing similar tasks themselves in a zero-shot manner without access to paired demonstrations or dense annotations.
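The sketch below illustrates the retrieve-and-compose step just described, under stated assumptions: we assume some learned encoder has already mapped both human demonstration segments and a library of robot skills into a shared "environmental effect" embedding space, and we retrieve, per human segment, the robot skill with the most similar effect. The names (robot_skill_library, retrieve_and_compose) are hypothetical stand-ins, not the TransAct interface.

# Illustrative sketch: zero-shot composition of robot skills by nearest-
# neighbour retrieval in a (hypothetical) shared effect-embedding space.
import numpy as np

rng = np.random.default_rng(1)
dim = 32
robot_skill_library = rng.normal(size=(100, dim))   # effect embeddings of 100 known robot skills
skill_names = [f"robot_skill_{i}" for i in range(100)]

def retrieve_and_compose(human_segment_effects, library, names):
    """For each temporally abstract segment of a human demonstration,
    retrieve the robot skill whose effect embedding is closest in cosine
    similarity, yielding a plan as a sequence of robot skills."""
    lib = library / np.linalg.norm(library, axis=1, keepdims=True)
    plan = []
    for effect in human_segment_effects:
        sims = lib @ (effect / np.linalg.norm(effect))
        plan.append(names[int(np.argmax(sims))])
    return plan

# Effect embeddings for the segments of one (synthetic) human demonstration.
human_demo_effects = rng.normal(size=(4, dim))
print(retrieve_and_compose(human_demo_effects, robot_skill_library, skill_names))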

To conclude the talk, we offer a broader perspective on the paradigm of unsupervised translation of temporal abstractions of behaviour across humans and robots. We discuss both alternative application domains (including potential applications to motivation in physical therapy via robot artists) and future research directions enabled by our work.

Thesis Committee Members:
Jean Oh, Chair
David Held
Shubham Tulsiani
Amy Zhang, The University of Texas at Austin
