Simulation of Vision-based Tactile Sensors using Physics based Rendering

Arpit Agarwal, Timothy Man, and Wenzhen Yuan
Conference Paper, Proceedings of (ICRA) International Conference on Robotics and Automation, October 2021

Abstract

Tactile sensing has seen rapid adoption with the advent of vision-based tactile sensors. Vision-based tactile sensors are compact and inexpensive, and they provide high-resolution data for precise in-hand manipulation and human-robot interaction. However, simulating these sensors remains a challenge. In this paper, we build the first fully general optical tactile simulation system for a GelSight sensor using physics-based rendering techniques. We propose physically accurate light models and present an in-depth analysis of the individual components of our simulation pipeline. Our system outperforms previous simulation techniques both qualitatively and quantitatively on image similarity metrics. Our code and experimental data are open-sourced on our project page.
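
To make the general idea concrete, the sketch below renders a toy tactile image from a contact height map by shading per-pixel normals with three colored directional lights. This is only a simplified Lambertian stand-in for illustration, not the paper's physics-based rendering pipeline: the light directions, colors, and the render_tactile_image function are hypothetical choices, and the actual system models the sensor's optics with physically accurate light models.

# Illustrative sketch only (not the authors' pipeline): shade a gel-surface
# height map with three colored directional lights, a common simplification of
# GelSight-style illumination. All directions, colors, and parameters below
# are hypothetical.
import numpy as np

def render_tactile_image(height_map, pixel_size_mm=0.05):
    """Render an RGB tactile image from a gel-surface height map (in mm)."""
    # Surface gradients -> per-pixel unit normals.
    gy, gx = np.gradient(height_map, pixel_size_mm)
    normals = np.dstack([-gx, -gy, np.ones_like(height_map)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    # Three directional lights loosely mimicking a GelSight's R/G/B LEDs
    # (directions and colors are illustrative guesses, not calibrated values).
    lights = [
        (np.array([ 1.0,  0.0, 0.5]), np.array([1.0, 0.2, 0.2])),  # red
        (np.array([-0.5,  0.9, 0.5]), np.array([0.2, 1.0, 0.2])),  # green
        (np.array([-0.5, -0.9, 0.5]), np.array([0.2, 0.2, 1.0])),  # blue
    ]

    image = np.zeros(height_map.shape + (3,))
    for direction, color in lights:
        direction = direction / np.linalg.norm(direction)
        # Lambertian shading: clamp negative dot products to zero.
        intensity = np.clip(normals @ direction, 0.0, None)
        image += intensity[..., None] * color
    return np.clip(image / len(lights), 0.0, 1.0)

if __name__ == "__main__":
    # Toy contact geometry: a sphere pressed into a flat gel pad.
    xs = np.linspace(-5, 5, 200)
    X, Y = np.meshgrid(xs, xs)
    radius, press_depth = 4.0, 1.0
    sphere = np.sqrt(np.clip(radius**2 - X**2 - Y**2, 0.0, None))
    height = np.clip(sphere - (radius - press_depth), 0.0, None)
    img = render_tactile_image(height)
    print(img.shape, img.min(), img.max())

A full physics-based renderer, as used in the paper, would replace this shading step with a calibrated light transport simulation of the sensor's gel, LEDs, and camera.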

BibTeX

@conference{Agarwal-2021-130384,
author = {Arpit Agarwal and Timothy Man and Wenzhen Yuan},
title = {Simulation of Vision-based Tactile Sensors using Physics based Rendering},
booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
year = {2021},
month = {October},
keywords = {tactile sensors},
}