EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras - Robotics Institute Carnegie Mellon University
VASC Seminar

Vimal Mollyn, PhD Student, Human Computer Interaction Institute, Carnegie Mellon University
Monday, October 28
3:30 pm to 4:30 pm
3305 Newell-Simon Hall
Abstract: In augmented and virtual reality (AR/VR) experiences, a user’s arms and hands can provide a convenient and tactile surface for touch input. Prior work has shown on-body input to have significant speed, accuracy, and ergonomic benefits over the in-air interfaces that are common today. In this work, we demonstrate high-accuracy, bare-handed (i.e., no special instrumentation of the user) on-skin input using just an RGB camera, like those already integrated into all modern XR headsets. Our results show this approach can be accurate and robust across diverse lighting conditions, skin tones, and body motion (e.g., input while walking). Finally, our pipeline also provides rich input metadata, including touch force, finger identification, angle of attack, and rotation. We believe these are the requisite technical ingredients to more fully unlock on-skin interfaces, which have been well motivated in the HCI literature but have lacked robust and practical methods.
Bio: I’m a PhD student in the Future Interfaces Group at Carnegie Mellon University, where I’m advised by Chris Harrison. I’m interested in creating new ways for people to interact with the world, drawing on my background in sensing and machine learning. Previously, I graduated with a Bachelor’s and Master’s from IIT Madras, where I majored in Engineering Design and Data Science.