Combining 3D Shape, Color, and Motion for Robust Anytime Tracking
Abstract
Although object tracking has been studied for decades, real-time tracking algorithms often suffer from low accuracy and poor robustness when confronted with difficult, real-world data. We present a tracker that combines 3D shape, color (when available), and motion cues to accurately track moving objects in real-time. Our tracker allocates computational effort based on the shape of the posterior distribution. Starting with a coarse approximation to the posterior, the tracker successively refines this distribution, increasing tracking accuracy over time. The tracker can thus be run for any amount of time, after which the current approximation to the posterior is returned. Even at a minimum runtime of 0.7 milliseconds, our method outperforms all of the baseline methods of similar speed by at least 10%. If our tracker is allowed to run for longer, the accuracy continues to improve, and it continues to outperform all baseline methods. Our tracker is thus anytime, allowing the speed or accuracy to be optimized based on the needs of the application.
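The anytime, coarse-to-fine idea described above can be illustrated with a minimal sketch: start with a coarse grid approximation, repeatedly refine around the current best estimate, and return whatever estimate is available when the time budget expires. This is a simplified 1-D illustration only, not the paper's method (which refines a posterior over alignments of 3D point clouds); the function `anytime_refine`, its parameters, and the scoring function are hypothetical.

```python
import time

def anytime_refine(score, lo, hi, deadline, init_step=0.5, min_step=1e-3):
    """Illustrative anytime coarse-to-fine search over [lo, hi].

    Evaluates `score` on a coarse grid, then repeatedly halves the grid
    step around the best estimate found so far. Stops when the deadline
    passes or the minimum resolution is reached, and returns the current
    best estimate -- so the answer improves the longer it runs.
    """
    best_x, best_s = lo, float("-inf")
    step = init_step
    center, radius = (lo + hi) / 2.0, (hi - lo) / 2.0
    while step >= min_step and time.monotonic() < deadline:
        # Evaluate the current grid level within the search window.
        x = max(lo, center - radius)
        while x <= min(hi, center + radius):
            s = score(x)
            if s > best_s:
                best_s, best_x = s, x
            x += step
        # Refine: shrink the window around the current best and halve the step.
        center, radius = best_x, 2.0 * step
        step /= 2.0
    return best_x
```

For example, maximizing a smooth score such as `lambda x: -(x - 0.7) ** 2` over [0, 2] with a generous deadline converges to an estimate near 0.7, while a tight deadline simply returns a coarser (but still usable) estimate.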
BibTeX
@conference{Held-2014-103057,
  author    = {David Held and Jesse Levinson and Sebastian Thrun and Silvio Savarese},
  title     = {Combining 3D Shape, Color, and Motion for Robust Anytime Tracking},
  booktitle = {Proceedings of Robotics: Science and Systems (RSS '14)},
  year      = {2014},
  month     = {July},
}