Unwrapping the eye for visible-spectrum gaze tracking on wearable devices

Bernardo R. Pires, Michaël Devyver, Akihiro Tsukada, and Takeo Kanade
Workshop Paper, IEEE Workshop on Applications of Computer Vision (WACV '13), pp. 369–376, January 2013

Abstract

Wearable devices with gaze tracking can assist users in many daily-life tasks. When such devices are used for extended periods of time, it is desirable that they not employ active illumination, both for safety reasons and to minimize interference from other light sources such as the sun. Most gaze-tracking methods that do not use active illumination attempt to locate the iris contour by fitting an ellipse. Although the camera projection causes the iris to appear as an ellipse in the eye image, it is actually a circle on the eye surface. Instead of searching for an ellipse in the eye image, the method proposed in this paper searches for a circle on the eye surface. To this end, the method calibrates a three-dimensional eye model based on the location of the corners of the eye. Using the 3D eye model, an input image is first transformed so that the eye's spherical surface is warped into a plane, thus “unwrapping” the eye. The iris circle is then detected on the unwrapped image by a three-step robust circle-fitting procedure. The location of the circle corresponds to the gaze orientation on the outside image. The method is fast to calibrate and runs in real time. Extensive experimentation on embedded hardware and comparisons with alternative methods demonstrate the effectiveness of the proposed solution.
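The pipeline described in the abstract — a calibrated spherical eye model, a warp that unwraps the sphere into a plane, and a robust circle fit on the unwrapped image — can be sketched as below. This is an illustrative reconstruction, not the paper's implementation: the spherical parameterization, the algebraic (Kåsa) least-squares circle fit, and the RANSAC loop are assumptions standing in for the paper's calibrated eye model and its three-step robust fitting procedure.

```python
import numpy as np

def unwrap_eye(image, sphere_center, sphere_radius, K, thetas, phis):
    """Sample a grayscale image on a (theta, phi) grid over the eyeball
    sphere, producing an 'unwrapped' planar view of the eye surface.
    Hypothetical stand-in for the paper's sphere-to-plane warp."""
    H, W = image.shape[:2]
    out = np.zeros((len(phis), len(thetas)), dtype=image.dtype)
    for i, phi in enumerate(phis):
        for j, theta in enumerate(thetas):
            # Point on the sphere surface, in camera coordinates;
            # (theta, phi) = (0, 0) is the point nearest the camera.
            p = sphere_center + sphere_radius * np.array([
                np.sin(theta) * np.cos(phi),
                np.sin(phi),
                -np.cos(theta) * np.cos(phi)])
            # Pinhole projection with intrinsics K, nearest-neighbor sampling.
            uvw = K @ p
            u = int(round(uvw[0] / uvw[2]))
            v = int(round(uvw[1] / uvw[2]))
            if 0 <= v < H and 0 <= u < W:
                out[i, j] = image[v, u]
    return out

def fit_circle_kasa(pts):
    """Algebraic (Kasa) least-squares circle fit through 2D points."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([cx, cy]), np.sqrt(c + cx**2 + cy**2)

def fit_circle_ransac(pts, n_iters=200, tol=1.0, seed=0):
    """Robust circle fit: sample 3 points, count inliers by radial
    distance, then refit on the best inlier set."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        center, r = fit_circle_kasa(sample)
        if not np.isfinite(r):
            continue
        d = np.abs(np.linalg.norm(pts - center, axis=1) - r)
        inliers = d < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_circle_kasa(pts[best_inliers])
```

On the unwrapped image the iris boundary is (ideally) a circle rather than a projected ellipse, so the robust fit operates on iris edge points extracted from the unwrapped view, and the fitted circle's position maps back to a gaze direction through the same spherical parameterization.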

BibTeX

@inproceedings{Pires-2013-122435,
author = {Bernardo R. Pires and Michaël Devyver and Akihiro Tsukada and Takeo Kanade},
title = {Unwrapping the eye for visible-spectrum gaze tracking on wearable devices},
booktitle = {Proceedings of IEEE Workshop on Applications of Computer Vision (WACV '13)},
year = {2013},
month = {January},
pages = {369--376},
}