Face Locating and Tracking for Human-computer Interaction

M. Hunke and Alex Waibel
Conference Paper, Proceedings of 28th Asilomar Conference on Signals, Systems and Computers (ACSSC '94), Vol. 2, pp. 1277 - 1281, October, 1994

Abstract

Effective human-to-human communication involves both auditory and visual modalities, providing robustness and naturalness in realistic communication situations. Recent efforts at our lab are aimed at providing such multimodal capabilities for human-machine communication. Most of the visual modalities require a stable image of a speaker's face. We propose a connectionist face tracker that manipulates camera orientation and zoom to keep a person's face located at all times. The system operates in real time and can adapt rapidly to different lighting conditions, cameras and faces, making it robust against environmental variability. Extensions and integration of the system with a multimodal interface are presented.
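For illustration, the closed-loop idea behind such a tracker can be sketched as follows. This is not the paper's connectionist network: a stock OpenCV Haar cascade stands in for the face detector, and send_pan_tilt is a hypothetical placeholder for the camera's pan/tilt control. The loop simply detects the largest face in each frame and issues proportional corrections to keep it near the image centre.

# Minimal sketch of a keep-the-face-centred tracking loop (assumptions noted above).
import cv2

def send_pan_tilt(pan_step: float, tilt_step: float) -> None:
    """Hypothetical camera interface; replace with real pan/tilt (and zoom) control."""
    print(f"pan {pan_step:+.3f}, tilt {tilt_step:+.3f}")

def track(gain: float = 0.1) -> None:
    # Stand-in detector; the paper uses a connectionist (neural-network) tracker instead.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detected face
            # Error between face centre and image centre, normalised to [-0.5, 0.5].
            err_x = (x + w / 2) / frame.shape[1] - 0.5
            err_y = (y + h / 2) / frame.shape[0] - 0.5
            # Proportional correction nudges the camera so the face stays centred.
            send_pan_tilt(gain * err_x, -gain * err_y)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    track()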

BibTeX

@conference{Hunke-1994-16045,
author = {M. Hunke and Alex Waibel},
title = {Face Locating and Tracking for Human-computer Interaction},
booktitle = {Proceedings of 28th Asilomar Conference on Signals, Systems and Computers (ACSSC '94)},
year = {1994},
month = {October},
volume = {2},
pages = {1277--1281},
}