On the Sustained Tracking of Human Motion

Conference Paper, Proceedings of the 8th IEEE International Conference on Automatic Face and Gesture Recognition (FG '08), September 2008

Abstract

Tracking humans requires addressing a number of challenges, including articulated motion estimation, self-occlusion, and varying appearance. In this paper, we propose an algorithm for sustained tracking of humans that combines frame-to-frame articulated motion estimation with a per-frame body detection algorithm. The proposed approach can automatically recover from tracking error and drift. The frame-to-frame motion estimation algorithm replaces the traditional dynamic model within a filtering framework. Stable and accurate per-frame motion is estimated via an image-gradient-based algorithm that solves a linear constrained least squares system. The per-frame detector learns the appearance of different body parts and 'sketches' expected gradient maps to detect discriminant pose configurations in images. The resulting online algorithm is computationally efficient and has been tested extensively on a large dataset of sequences of drivers in vehicles. It shows stability and sustained accuracy over thousands of frames.
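
For illustration only, the following Python sketch (not taken from the paper) shows the kind of equality-constrained linear least squares solve the abstract alludes to: minimize ||Ax - b||^2 subject to Cx = d, solved via its KKT system. Here A and b stand in for image-gradient (brightness-constancy) constraints and C and d for linear articulation constraints; all names and matrices are illustrative assumptions, not the authors' implementation.

import numpy as np

def constrained_least_squares(A, b, C, d):
    """Solve min ||A x - b||^2 subject to C x = d via the KKT system.

    A, b: stand-ins for gradient-based motion constraints (assumed).
    C, d: stand-ins for linear articulation constraints (assumed).
    """
    n = A.shape[1]
    m = C.shape[0]
    # KKT matrix: [[2 A^T A, C^T], [C, 0]]
    K = np.block([
        [2.0 * A.T @ A, C.T],
        [C, np.zeros((m, m))],
    ])
    rhs = np.concatenate([2.0 * A.T @ b, d])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]  # motion parameters; sol[n:] are Lagrange multipliers

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 6))          # toy gradient-constraint rows
    b = rng.standard_normal(40)
    C = np.array([[1.0, -1.0, 0, 0, 0, 0]])   # toy linear articulation constraint
    d = np.array([0.0])
    x = constrained_least_squares(A, b, C, d)
    print(x, C @ x)                            # constraint residual should be ~0

This KKT formulation is only one standard way to handle linear equality constraints in a least squares solve; the paper's actual parameterization of articulated motion and its constraint set are described in the full text.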

BibTeX

@conference{Sheikh-2008-10074,
author = {Yaser Ajmal Sheikh and Ankur Datta and Takeo Kanade},
title = {On the Sustained Tracking of Human Motion},
booktitle = {Proceedings of 8th IEEE International Conference on Automatic Face and Gesture Recognition (FG '08)},
year = {2008},
month = {September},
keywords = {Motion Estimation, Human Tracking, Human Detection},
}