Abstract:
Touch is an essential sensing modality for making autonomous robots more dexterous and allowing them to work collaboratively with humans. In particular, the advent of vision-based tactile sensors has spurred efforts to design them for different robotic manipulation tasks. However, this design task remains challenging, for two reasons: first, the sensor itself requires the compact integration of multiple optical elements to improve optical signal fidelity during interaction with the environment; second, successfully integrating vision-based tactile sensors into robotic manipulation tasks requires the co-design of both the sensors and the robot structure itself for optimal sensing and control.
This thesis aims to alleviate these two challenges by creating a general design framework that allows a roboticist to iterate quickly on the design and evaluation of vision-based tactile sensors for designated robotic manipulation tasks. First, our framework uses an optical simulator, based on physics-based rendering, that can accurately and efficiently generate the images captured by arbitrary sensor designs upon tactile indentation. Second, our framework uses a procedural sensor shape generator and introduces novel objective functions (perceptual and geometric) to improve tactile sensor designs automatically. Third, we develop a general, modular, and interactive pipeline for rapid sensor prototyping, aimed at novice users, that can automatically generate parameterized designs and optimize optical components within minutes. We provide an implementation of our framework as a design toolbox, OptiSense Studio.
We showcase sim2real comparisons for a range of GelSight-like tactile sensors, specifically sensors that include optical components such as mirrors, curved sensing surfaces, fluorescence, and light piping. Furthermore, we demonstrate the utility of our procedural sensor shape optimization for a curved tactile sensor and, by manufacturing a real-world prototype, compare the performance of the optimized design against designs hand-optimized by sensor experts. Using our interactive toolbox, we design a new tactile sensor and improve existing ones entirely virtually.
This thesis tackles a critical problem in the development of vision-based tactile sensors and, consequently, in their adoption for diverse sensing applications. Through this thesis, we demonstrate the utility of our framework for designing vision-based tactile sensors and compliant tactile sensors. More broadly, we aim to create a new point of convergence between disparate communities such as computer graphics (physics-based rendering and simulation), optics (optical lens and material design), and robotics, and to foster new research directions within and across these communities.
Thesis Committee Members:
Wenzhen Yuan, Co-chair, University of Illinois Urbana-Champaign
Ioannis Gkioulekas, Co-chair
Nancy Pollard
Edward Adelson, MIT