Robust Real-Time Tracking Combining 3D Shape, Color, and Motion
Abstract
Real-time tracking algorithms often suffer from low accuracy and poor robustness when confronted with difficult, real-world data. We present a tracker that combines 3D shape, color (when available), and motion cues to accurately track moving objects in real-time. Our tracker allocates computational effort based on the shape of the posterior distribution. Starting with a coarse approximation to the posterior, the tracker successively refines this distribution, increasing tracking accuracy over time. The tracker can thus be run for any amount of time, after which the current approximation to the posterior is returned. Even at a minimum runtime of 0.37 milliseconds per object, our method outperforms all of the baseline methods of similar speed by at least 25% in RMS tracking error. If our tracker is allowed to run for longer, the accuracy continues to improve, and it continues to outperform all baseline methods. Our tracker is thus anytime, allowing the speed or accuracy to be optimized based on the needs of the application. By combining 3D shape, color (when available), and motion cues in a probabilistic framework, our tracker is able to robustly handle changes in viewpoint, occlusions, and lighting variations for moving objects of a variety of shapes, sizes, and distances.
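The abstract describes an anytime scheme: the tracker maintains an approximation to the posterior over object states, refines it coarse-to-fine until a time budget expires, and returns the current estimate. A minimal sketch of that anytime pattern, assuming a simple 1D state, a sampled posterior, and a resample-and-perturb refinement step (names and details are illustrative assumptions, not the paper's actual algorithm):

```python
import random
import time

def anytime_track(likelihood, samples, budget_s):
    """Hypothetical anytime-refinement sketch (an assumption, not the
    paper's implementation): keep a sampled approximation of the
    posterior, refine it until the time budget expires, then return
    the current estimate."""
    deadline = time.monotonic() + budget_s
    noise = 1.0  # perturbation scale, shrunk each round (coarse-to-fine)
    while time.monotonic() < deadline:
        weights = [likelihood(s) for s in samples]
        total = sum(weights)
        if total == 0:
            break
        # Resample states in proportion to their posterior weight,
        # perturbing each copy at a progressively finer scale.
        samples = [s + random.gauss(0.0, noise)
                   for s in random.choices(samples, weights=weights,
                                           k=len(samples))]
        noise *= 0.5
    # Whenever the budget runs out, report the weighted-mean estimate
    # from the current approximation to the posterior.
    weights = [likelihood(s) for s in samples]
    return sum(w * s for w, s in zip(weights, samples)) / sum(weights)
```

Because each round only sharpens the current approximation, the loop can be cut off at any point and still return a valid (if coarser) estimate, which is what lets speed and accuracy be traded off per application.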
BibTeX
@article{Held-2016-103055,
  author  = {David Held and Jesse Levinson and Sebastian Thrun and Silvio Savarese},
  title   = {Robust Real-Time Tracking Combining 3D Shape, Color, and Motion},
  journal = {International Journal of Robotics Research: Special Issue on RSS '14},
  year    = {2016},
  month   = {January},
  volume  = {35},
  number  = {1},
  pages   = {30--49},
}