A Real-time Augmented Reality Surgical System for Overlaying Stiffness Information
Abstract
We describe a surgical system that autonomously searches for tumors and dynamically displays a computer graphic model of them superimposed on the organ (or, in our case, a phantom). Once localized, the phantom is tracked in real time and augmented with overlaid stiffness information in 3D. We believe that such a system has the potential to quickly reveal the location and shape of tumors, and that the visual overlay will reduce the surgeon's cognitive load. The contribution of this paper is the integration of disparate technologies to achieve this system. To the best of our knowledge, our approach is one of the first to incorporate state-of-the-art methods in registration, force sensing, and tumor localization into a unified surgical system. First, the preoperative model is registered to the intraoperative scene using a Bingham distribution-based filtering approach. Active level set estimation is then used to find the location and shape of the tumors. We use a recently developed miniature force sensor to perform the palpation. The estimated stiffness map is then dynamically overlaid onto the registered preoperative model of the organ. We demonstrate the efficacy of our system through experiments on a phantom prostate model and other silicone organs with embedded stiff inclusions, using the da Vinci Research Kit (dVRK).
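To make the tumor-search step concrete, the following is a minimal, illustrative Python sketch (not the authors' implementation) of active level set estimation over a stiffness map: palpation measurements are fed to a Gaussian process, and the next probe location is chosen with a straddle-style rule that concentrates touches near the boundary of a stiff inclusion. The kernel parameters, the stiffness threshold, and all function and class names here are assumptions made for this example.

# Illustrative sketch only: GP-based active level-set estimation of a
# surface stiffness map from palpation measurements. All names and
# parameter values are assumptions, not taken from the paper.
import numpy as np

def rbf_kernel(A, B, length_scale=0.05, variance=1.0):
    """Squared-exponential kernel between two sets of 2-D surface points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

class StiffnessGP:
    """Minimal GP regressor: palpated surface points -> estimated stiffness."""
    def __init__(self, noise=1e-3):
        self.noise = noise
        self.X = np.empty((0, 2))
        self.y = np.empty(0)

    def add_measurement(self, x, stiffness):
        self.X = np.vstack([self.X, x])
        self.y = np.append(self.y, stiffness)

    def predict(self, Xq):
        K = rbf_kernel(self.X, self.X) + self.noise * np.eye(len(self.X))
        Ks = rbf_kernel(Xq, self.X)
        Kss = rbf_kernel(Xq, Xq)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, self.y))
        mu = Ks @ alpha
        v = np.linalg.solve(L, Ks.T)
        var = np.diag(Kss) - (v**2).sum(0)
        return mu, np.sqrt(np.maximum(var, 1e-12))

def next_probe_point(gp, candidates, threshold=0.6, beta=1.96):
    """Straddle rule: probe where the stiffness level set is most ambiguous."""
    mu, sigma = gp.predict(candidates)
    straddle = beta * sigma - np.abs(mu - threshold)
    return candidates[np.argmax(straddle)]

if __name__ == "__main__":
    # Candidate palpation points on the (already registered) organ surface.
    rng = np.random.default_rng(0)
    grid = np.stack(np.meshgrid(np.linspace(0, 1, 25),
                                np.linspace(0, 1, 25)), -1).reshape(-1, 2)
    # Toy ground truth: a single stiff inclusion near the center.
    true_stiffness = lambda p: 1.0 / (1.0 + 25 * ((p - 0.5) ** 2).sum(-1))

    gp = StiffnessGP()
    x = grid[rng.integers(len(grid))]
    for _ in range(30):                        # palpate 30 points adaptively
        gp.add_measurement(x, true_stiffness(x) + 0.01 * rng.standard_normal())
        x = next_probe_point(gp, grid)

    stiffness_map, _ = gp.predict(grid)        # dense map for the AR overlay

The dense stiffness_map produced at the end stands in for the map that would be colored and dynamically overlaid onto the registered preoperative organ model.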
BibTeX
@conference{Zevallos-2018-106050,
author = {Nicolas Zevallos and Arun Srivatsan Rangaprasad and Hadi Salman and Lu Li and Jianing Qian and Saumya Saxena and Mengyun Xu and Kartik Patath and Howie Choset},
title = {A Real-time Augmented Reality Surgical System for Overlaying Stiffness Information},
booktitle = {Proceedings of Robotics: Science and Systems (RSS '18)},
year = {2018},
month = {June},
}