Learning to Perceive and Predict Everyday Interactions

GHC 6121

Abstract: This thesis aims to develop a computer vision system that can understand everyday human interactions with rich spatial information. Such systems can benefit VR/AR, by perceiving reality and updating its virtual twin, and robotics, by learning manipulation from watching humans. Previous methods have been limited to constrained lab environments or pre-selected objects with [...]

Faculty Candidate: Wenshan Wang

Newell-Simon Hall 4305

Towards General Autonomy: Learning from Simulation, Interaction, and Demonstration

Abstract: Today's autonomous systems remain brittle in challenging environments, or rely on designers to anticipate every possible scenario in order to respond appropriately. On the other hand, robot systems leveraging machine learning techniques are trained in simulation or the real world for various tasks. Due to [...]

From Videos to 4D Worlds and Beyond

Newell-Simon Hall 3305

Abstract: The world underlying images and videos is 3-dimensional and dynamic, i.e. 4D, with people interacting with each other, with objects, and with the underlying scene. Even in videos of a static scene, the camera is always moving about in the 4D world. Accurately recovering this information is essential for building systems that can reason [...]