Gaze Estimation

Passive gaze estimation is usually performed by locating the pupils and the inner and outer eye corners in an image of the driver’s head. Of these feature points, the eye corners are just as important as, and perhaps harder to detect than, the pupils. The eye corners are usually found using local feature detectors and trackers. In contrast, we have built a passive gaze tracking system that uses a global head model, specifically an Active Appearance Model (AAM), to track the whole head. From the fitted AAM, the eye corners, eye region, and head pose are robustly extracted and then used to estimate the gaze direction. See here for more details of our AAM tracking algorithms, and see the movies below for examples of our results.


Movie of the driver's head from the interior camera

Corresponding movie of external scene
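
As a concrete illustration of how extracted feature points could be turned into a gaze direction, the sketch below approximates the eyeball as a sphere whose projected centre lies midway between the two eye corners and casts a ray from that centre through the detected pupil. The function name, the spherical-eyeball approximation, and the half-eye-width radius are illustrative assumptions for this sketch only, not the calibrated model used in the project.

    import numpy as np

    def estimate_gaze_direction(inner_corner, outer_corner, pupil):
        """Sketch: recover a gaze ray from 2D eye feature points.

        All arguments are (x, y) image coordinates, for example read off
        a fitted AAM mesh.  The eyeball is approximated as a sphere whose
        projected centre lies midway between the corners and whose radius
        is half the eye width -- crude assumptions made only for this
        illustration.
        """
        inner = np.asarray(inner_corner, dtype=float)
        outer = np.asarray(outer_corner, dtype=float)
        pupil = np.asarray(pupil, dtype=float)

        centre = 0.5 * (inner + outer)                 # projected eyeball centre
        radius = 0.5 * np.linalg.norm(outer - inner)   # assumed eyeball radius

        # In-plane offset of the pupil from the projected eyeball centre.
        dx, dy = pupil - centre

        # Out-of-plane component from the sphere equation r^2 = dx^2 + dy^2 + dz^2,
        # clamped so measurement noise cannot make the radicand negative.
        dz = np.sqrt(max(radius ** 2 - dx ** 2 - dy ** 2, 0.0))

        # Ray from the eyeball centre through the pupil; negative z points
        # out of the image plane, towards the camera.
        gaze = np.array([dx, dy, -dz])
        return gaze / np.linalg.norm(gaze)

    # Example: a pupil shifted towards the outer corner yields a sideways gaze.
    print(estimate_gaze_direction((100.0, 120.0), (140.0, 118.0), (125.0, 119.0)))

In the actual system the head pose extracted from the AAM would also enter this computation, for example to place the eyeball centre in 3D; the sketch omits that step for brevity.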
Publications

2004
Takahiro Ishikawa, Simon Baker, Iain Matthews, and Takeo Kanade. Conference paper, Proceedings of the 11th World Congress on Intelligent Transportation Systems, October 2004.
Takahiro Ishikawa, Simon Baker, Iain Matthews, and Takeo Kanade. Tech report, CMU-RI-TR-04-08, Robotics Institute, Carnegie Mellon University, February 2004.

Past Head

  • Simon Baker

Past Staff

  • Takahiro Ishikawa

Past Contact

  • Simon Baker