A multi-sensor fusion system for moving object detection and tracking in urban driving environments
Abstract
A self-driving car, to be deployed in real-world driving environments, must be capable of reliably detecting and effectively tracking nearby moving objects. This paper presents our new moving object detection and tracking system, which extends and improves our earlier system used for the 2007 DARPA Urban Challenge. We revised our earlier motion and observation models for active sensors (i.e., radars and LIDARs) and introduced a vision sensor. In the new system, the vision module detects pedestrians, bicyclists, and vehicles to generate corresponding vision targets. Our system uses this visual recognition information to improve the tracking model selection, data association, and movement classification of our earlier system. Through tests on logged data from actual driving, we demonstrate the improvement and performance gain of our new tracking system.
BibTeX
@conference{Cho-2014-126198,
  author = {Hyunggi Cho and Young-Woo Seo and BVK Vijaya Kumar and Ragunathan Raj Rajkumar},
  title = {A multi-sensor fusion system for moving object detection and tracking in urban driving environments},
  booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
  year = {2014},
  month = {May},
  pages = {1836--1843},
}