Abstract:
Touch is an essential sensing modality for making autonomous robots more dexterous and enabling them to work collaboratively with humans. With the advent of vision-based tactile sensors, roboticists have incorporated tactile sensing into various robot structures and manipulation tasks to increase robustness, precision, and reliability. However, designing vision-based tactile sensors remains challenging, as it requires the compact integration of multiple optical elements to produce high-contrast images during environment interaction. Co-designing robot structures and sensing therefore calls for a framework that can produce tactile sensors with varied shapes and sensing characteristics.
To this end, this thesis presents a general design framework to quickly iterate on the design and evaluation of vision-based tactile sensors. The framework comprises a physically accurate optical simulation, procedural sensor shape generation, and a novel design-evaluation objective function.
Our optical simulation system leverages physics-based rendering techniques from computer graphics to generate realistic tactile images for a given sensor assembly. We first perform real-to-sim experiments to calibrate our simulation models. Our first completed work demonstrates the utility of the optical simulation in generating accurate tactile images for the GelSight sensor with a flat sensing surface, and shows that the simulated images match real-world measurements both qualitatively and quantitatively. Our ongoing work introduces a general design framework for vision-based tactile sensors and showcases the design of a curved tactile sensor. The framework consists of automatic sensor generation, a novel objective function, and an optimization pipeline that produces the best tactile sensor under given design constraints. In the context of curved tactile sensors, we additionally introduce a novel low-dimensional 2D curve parameterization for automatic sensor shape generation.
In the first proposed work, we leverage the design framework to improve a soft robotic finger with integrated tactile sensing. To achieve this, we will develop an interactive design framework for selecting key design spaces and then use optimization to find the best design. In the second proposed work, we plan to improve an omnidirectional tactile sensor, GelSight360, for better perception at the center of its sensing surface. In the third proposed work, we design the illumination system of the Finray GelSight gripper, investigating various light types and their placement within the gripper to achieve the best perception. Through this thesis, we demonstrate the utility of the proposed design framework in co-designing vision-based tactile sensors and soft robots.
Thesis Committee Members:
Wenzhen Yuan, Chair
Ioannis Gkioulekas
Nancy Pollard
Edward Adelson, MIT