Deep Learning for Tactile Sensing: Development to Deployment
Abstract: Sensing is widely acknowledged as essential for robots interacting with the physical environment. However, few contemporary sensors have gained widespread use among roboticists. This thesis proposes a framework for incorporating sensors into a robot learning paradigm, from development to deployment, through the lens of ReSkin -- a versatile and scalable magnetic tactile sensor. [...]
Learning and Translating Temporal Abstractions of Behaviour across Humans and Robots
Abstract: Humans are remarkably adept at learning to perform tasks by imitating other people demonstrating these tasks. Key to this is our ability to reason abstractly about the high-level strategy of the task at hand (such as the recipe of cooking a dish) and the behaviours needed to solve this task (such as the behaviour [...]
Towards Underwater 3D Visual Perception
Abstract: With modern robotic technologies, seafloor imagery has become more accessible to both researchers and the public. This thesis leverages deep learning and 3D vision techniques to extract valuable information from seafloor image observations. Despite the widespread use of deep learning and 3D vision algorithms across various fields, underwater imaging presents unique challenges, such as [...]
What Makes Learning to Control Easy or Hard?
Abstract: Designing autonomous systems that are simultaneously high-performing, adaptive, and provably safe remains an open problem. In this talk, we will argue that in order to meet this goal, new theoretical and algorithmic tools are needed that blend the stability, robustness, and safety guarantees of robust control with the flexibility, adaptability, and performance of machine [...]
Soft Wearable Haptic Devices for Ubiquitous Communication
Abstract: Haptic devices allow touch-based information transfer between humans and intelligent systems, enabling communication in a salient but private manner that frees other sensory channels. For such devices to become ubiquitous, their physical and computational aspects must be intuitive and unobtrusive. The amount of information that can be transmitted through touch is limited in large [...]
Robots That Know When They Don’t Know
Abstract: Foundation models from machine learning have enabled rapid advances in perception, planning, and natural language understanding for robots. However, current systems lack any rigorous assurances when required to generalize to novel scenarios. For example, perception systems can fail to identify or localize unfamiliar objects, and large language model (LLM)-based planners can hallucinate outputs that [...]