Carnegie Mellon University
AI for Human Mobility
Abstract: This talk will describe a series of AI and robotics projects aimed at helping people independently move through cities and buildings. Projects include a deployed personalized transit information app, guide robots for people who are blind, and an integrated AI system that assists blind users with guidance and exploration. Specific findings will be presented [...]
Online-Adaptive Self-Supervised Learning with Visual Foundation Models for Autonomous Off-Road Driving
Abstract: Autonomous robot navigation in off-road environments currently presents a number of challenges. The lack of structure makes it difficult to handcraft geometry-based heuristics that are robust to the diverse set of scenarios the robot might encounter. Many of the learned methods that work well in urban scenarios require massive amounts of hand-labeled data, but [...]
Multimodal Representations for Adaptable Robot Policies in Human-Inhabited Spaces
Abstract: Human beings sense and express themselves through multiple modalities. To capture these multimodal forms of human communication, I want to build adaptable robot policies that infer task pragmatics from video and language prompts, reason about sounds and other sensors, take actions, and learn mannerisms for interacting with people and objects. Existing solutions for robot policies [...]