Ultrasound tracking using ProbeSight: Camera pose estimation relative to external anatomy by inverse rendering of a prior high-resolution 3D surface map
Abstract
This paper addresses the problem of freehand ultrasound probe tracking without requiring an external tracking device, by mounting a video camera on the probe to determine its location relative to the patient's external anatomy. By pre-acquiring a high-resolution 3D surface map as an atlas of the anatomy, we eliminate the need for artificial skin markers. We use an OpenDR pipeline for inverse rendering and pose estimation, matching the real-time camera image against the 3D surface map. To address the difficulty of distinguishing rotation from translation, we include an inertial navigation system to accurately measure rotation. Experiments on both a phantom bearing an image of human skin (palm) and actual human skin (fingers, palm, and wrist) validate the effectiveness of our approach. For ultrasound, this will permit the compilation of 3D ultrasound data as the probe is moved, as well as comparison of real-time ultrasound scans registered against previous scans from the same anatomical location. In a broader sense, tools that know where they are by looking at the patient's exterior could have a broad beneficial impact on clinical medicine.
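The sketch below illustrates, in broad strokes, how such an inverse-rendering pose estimate can be set up with OpenDR and chumpy: render the prior 3D surface map under a hypothesized camera pose, form a photometric difference with the live probe-camera image, and let the differentiable renderer drive the pose toward a minimum. It is a minimal illustration, not the paper's implementation; the loader functions (load_surface_map, load_camera_frame), the camera intrinsics, the initial pose, and the light placement are all assumptions made for the example.

import numpy as np
import chumpy as ch
from opendr.renderer import ColoredRenderer
from opendr.camera import ProjectPoints
from opendr.lighting import LambertianPointLight

w, h = 640, 480                                   # assumed image size
# Hypothetical loader for the pre-acquired high-resolution surface map (atlas).
verts, faces, colors = load_surface_map('skin_atlas.ply')

# Pose parameters to recover: axis-angle rotation and translation (chumpy free variables).
# In the paper an inertial sensor supplies rotation; one could instead fix rt from the
# IMU reading and optimize only the translation t.
rt = ch.zeros(3)
t = ch.array([0.0, 0.0, 0.5])                     # assumed initial guess: 0.5 m in front of the camera

rn = ColoredRenderer()
rn.camera = ProjectPoints(v=verts, rt=rt, t=t,
                          f=np.array([500.0, 500.0]),   # assumed focal lengths (pixels)
                          c=np.array([w, h]) / 2.0,     # principal point at image center
                          k=np.zeros(5))                # no lens distortion assumed
rn.frustum = {'near': 0.01, 'far': 2.0, 'width': w, 'height': h}
rn.set(v=verts, f=faces, bgcolor=np.zeros(3))
rn.vc = LambertianPointLight(f=faces, v=verts, num_verts=len(verts),
                             light_pos=np.array([0.0, 0.0, -1.0]),
                             vc=colors, light_color=np.ones(3))

# Hypothetical grabber for the live probe-camera frame (float RGB in [0, 1]).
observed = load_camera_frame()

# Photometric error between the rendered atlas and the live image; chumpy
# differentiates it with respect to rt and t through the renderer.
E = rn - observed
ch.minimize({'photometric': E}, x0=[rt, t], method='dogleg',
            options={'maxiter': 50})

print('Estimated rotation (axis-angle):', rt.r)
print('Estimated translation:', t.r)

In practice, a coarse-to-fine scheme (e.g., optimizing a Gaussian-pyramid version of the error, as in the OpenDR examples) helps avoid local minima when the initial pose guess is far from the true camera position.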
http://www.pdf-express.org/Conf/37755X/versions/807638/PID4620243.pdf
BibTeX
@conference{Wang-2017-104391,
  author    = {J. Wang and C. Che and J. Galeotti and S. Horvath and V. Gorantla and G. Stetten},
  title     = {Ultrasound tracking using ProbeSight: Camera pose estimation relative to external anatomy by inverse rendering of a prior high-resolution 3D surface map},
  booktitle = {Proceedings of IEEE Winter Conference on Applications of Computer Vision (WACV '17)},
  year      = {2017},
  month     = {March},
  pages     = {825--833},
}