Real-Time Non-Rigid Driver Head Tracking for Driver Mental State Estimation

Simon Baker, Iain Matthews, Jing Xiao, Ralph Gross, Takeo Kanade, and Takahiro Ishikawa
Conference Paper, Proceedings of 11th World Congress on Intelligent Transportation Systems, October, 2004

Abstract

The non-rigid motion of a driver's head (i.e., the motion of their mouth, eyebrows, cheeks, etc.) can tell us a lot about their mental state, e.g., whether they are drowsy, alert, aggressive, comfortable, tense, or distracted. In this paper, we describe our recent research on non-rigid face tracking. In particular, we present both 2D and 3D algorithms for tracking the non-rigid motion of the driver's head using an Active Appearance Model. Both algorithms operate at over 200 frames per second. We also present algorithms for converting a 2D model into a 3D model and for fitting with occlusion and large pose variation.
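For readers unfamiliar with Active Appearance Models, the sketch below illustrates the linear shape and appearance model that such trackers fit. The array sizes, variable names, and the least-squares recovery step are illustrative assumptions only; this is not the paper's real-time inverse compositional fitting algorithm.

```python
# A minimal, illustrative sketch of the linear shape/appearance model behind an
# Active Appearance Model (AAM). All dimensions and names are assumptions for
# illustration; the warp/shape-fitting part of a real AAM fitter is omitted.
import numpy as np

rng = np.random.default_rng(0)

# Mean shape (2D landmark coordinates, flattened) and shape basis vectors.
n_landmarks, n_shape_modes = 68, 4                      # assumed sizes
s0 = rng.normal(size=2 * n_landmarks)                   # mean shape
S = rng.normal(size=(n_shape_modes, 2 * n_landmarks))   # shape eigenvectors

# Mean appearance (pixels in the mean-shape frame) and appearance basis.
n_pixels, n_app_modes = 1000, 6                         # assumed sizes
a0 = rng.normal(size=n_pixels)                          # mean appearance
A = rng.normal(size=(n_app_modes, n_pixels))            # appearance eigenvectors

def aam_instance(p, lam):
    """Model instance: shape s(p) = s0 + p @ S, appearance a(lam) = a0 + lam @ A."""
    return s0 + p @ S, a0 + lam @ A

# Toy example: recover the appearance parameters of a target image already
# warped into the mean-shape frame, by linear least squares.
true_lam = np.array([1.0, -0.5, 0.2, 0.0, 0.3, -1.2])
target = a0 + true_lam @ A
lam_hat, *_ = np.linalg.lstsq(A.T, target - a0, rcond=None)
print(np.allclose(lam_hat, true_lam))                   # True
```

In an actual AAM tracker the shape parameters are recovered jointly with the appearance parameters by iteratively minimizing the appearance error after warping the image to the mean-shape frame; the efficient update used to reach real-time rates is described in the paper itself.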

BibTeX

@conference{Baker-2004-9044,
author = {Simon Baker and Iain Matthews and Jing Xiao and Ralph Gross and Takeo Kanade and Takahiro Ishikawa},
title = {Real-Time Non-Rigid Driver Head Tracking for Driver Mental State Estimation},
booktitle = {Proceedings of 11th World Congress on Intelligent Transportation Systems},
year = {2004},
month = {October},
}