
Real-Time Registration of Video with Ultrasound using Stereo Disparity

Jihang Wang
Master's Thesis, Biomedical Engineering, Carnegie Mellon University, May 2012

Abstract

Enabling an ultrasound machine to scan the patient while simultaneously capturing a 3D color surface view of the external anatomy, combined with the traditional interior ultrasound image, has the potential to revolutionize areas such as plastic surgery and eye surgery that may benefit from visualization of ultrasound registered with the patient's surface. Critical to implementing this kind of visualization are the capabilities to accurately acquire 3D point clouds of the exterior surface along with high-resolution ultrasound data, and then to register them into one coordinate system. This report proposes an image fusion device called Probe-Sight that realizes this functionality by mounting two video cameras on an ultrasound probe in a lightweight aluminum frame. The motivation for the Probe-Sight device is discussed first, including its initial application of giving plastic surgeons a better understanding of nerve regeneration after surgery. Background on augmented reality and an early version of Probe-Sight is discussed next in the literature review. The report then demonstrates how Probe-Sight's stereo vision algorithms work by introducing the foundations of stereopsis, including stereo camera calibration, epipolar geometry, and stereo rectification. Different algorithms for matching stereo points are then discussed, followed by our implementation using GPU-based methods from the Open Source Computer Vision Library (OpenCV). In addition, our image rendering framework is introduced, which registers 3D exterior information and interior ultrasound data together in real time. We report on the successful operation of our device, demonstrating a 3D rendering of an ultrasound phantom's surface with the ultrasound data superimposed at its correct relative location.
Eventually, automated analysis of these registered data sets may permit the scanner and its associated computational apparatus to interpret the ultrasound data within its anatomical context, much as the human operator does today.

BibTeX

@misc{Wang-2012-7495,
author = {Jihang Wang},
title = {Real-Time Registration of Video with Ultrasound using Stereo Disparity},
publisher = {Master's Thesis, Department of Biomedical Engineering, Carnegie Mellon University},
school = {Biomedical Engineering, Carnegie Mellon University},
month = {May},
year = {2012},
address = {Pittsburgh, PA},
}