Student Talks
Flexible Perception for High-Performance Robot Navigation
Abstract: Real-world autonomy requires perception systems that deliver rich, accurate information given the task and environment. However, as robots scale to diverse and rapidly evolving settings, maintaining this level of performance becomes increasingly brittle and labor-intensive, requiring significant human engineering and retraining for even small changes in environment and problem definition. To overcome this bottleneck, [...]
Learning Bayesian Experimental Design Policies Efficiently and Robustly
Abstract: Bayesian Experimental Design (BED) provides a principled framework for sequential data collection under uncertainty, and is used in a wide range of domains such as clinical trials, ecological monitoring, and hyperparameter optimization. Despite its wide applicability, BED methods remain challenging to deploy in practice due to their significant computational demands. This thesis addresses these computational [...]
Unlocking Robust Spatial Perception: Resilient State Estimation and Mapping for Long-term Autonomy
Abstract: How can we enable robots to perceive, adapt, and understand their surroundings like humans—in real-time and under uncertainty? Just as humans rely on vision to navigate complex environments, robots need robust and intelligent perception systems—“eyes” that can endure sensor degradation, adapt to changing conditions, and recover from failure. However, today’s visual systems are fragile—easily [...]
From Pixels to Physical Intelligence: Semantic 3D Data Generation at Internet Scale
Abstract: Modern AI won’t achieve physical intelligence until it can extract rich, semantic spatial knowledge from the wild ocean of internet video—not just curated motion-capture datasets or expensive 3D scans. This thesis proposes a self-bootstrapping pipeline for converting raw pixels into large-scale 3D and 4D spatial understanding. It begins with multi-view bootstrapping: using just two [...]
Self-Supervised Perception for Tactile Dexterity
Abstract: Humans are incredibly dexterous. We interact with and manipulate tools effortlessly, leveraging touch without giving it a second thought. Yet, replicating this level of dexterity in robots is a major challenge. While the robotics community, recognizing the importance of touch in fine manipulation, has developed a wide variety of tactile sensors, how best to [...]
Prompt-to-Product: Generative Assembly via Bimanual Manipulation
Abstract: Assembled products are ubiquitous in our lives: chairs, tables, couches, drawers, and more. Due to the complex interactions between components, creating such products typically demands significant manual effort in 1) designing the assembly and 2) constructing the product. This thesis seeks to reduce the required manual effort by automating the creation process [...]
Differentiable Probabilistic Inference and Rendering for Multimodal Robotic Perception
Abstract: Robots are increasingly deployed to automate tasks that are dangerous or mundane for humans, such as search and rescue, mapping, and inspection in difficult environments. They rely on their perception stack, typically composed of complementary sensing modalities, to estimate their own state and the state of the environment and enable informed decision-making. This thesis [...]