Face Locating and Tracking for Human-computer Interaction
Conference Paper, Proceedings of 28th Asilomar Conference on Signals, Systems and Computers (ACSSC '94), Vol. 2, pp. 1277 - 1281, October, 1994
Abstract
Effective human-to-human communication involves both auditory and visual modalities, providing robustness and naturalness in realistic communication situations. Recent efforts at our lab are aimed at providing such multimodal capabilities for human-machine communication. Most of these visual modalities require a stable image of a speaker's face. We propose a connectionist face tracker that manipulates camera orientation and zoom to keep a person's face located at all times. The system operates in real time and can adapt rapidly to different lighting conditions, cameras and faces, making it robust against environmental variability. Extensions and integration of the system with a multimodal interface are presented.
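The abstract describes a tracker that steers camera orientation so the located face stays centered in the image. As a rough illustration only (the function names, gain parameter, and proportional-control scheme below are assumptions, not the paper's actual method), the steering step can be sketched as: take a per-pixel face-probability map from some classifier, compute the centroid of face pixels, and issue pan/tilt corrections proportional to the centroid's offset from the image center.

```python
import numpy as np

def face_centroid(prob_map, threshold=0.5):
    """Centroid (x, y) of pixels classified as face, or None if no face pixels."""
    ys, xs = np.nonzero(prob_map > threshold)
    if len(xs) == 0:
        return None
    return xs.mean(), ys.mean()

def pan_tilt_correction(centroid, frame_shape, gain=0.1):
    """Proportional corrections that steer the camera so the face centroid
    moves toward the image center (positive pan = turn right, positive
    tilt = turn down, in normalized image units)."""
    h, w = frame_shape
    cx, cy = centroid
    pan = gain * (cx - w / 2) / w
    tilt = gain * (cy - h / 2) / h
    return pan, tilt

# Synthetic face-probability map: a bright blob to the right of
# and above the image center stands in for the classifier output.
prob = np.zeros((120, 160))
prob[40:60, 100:130] = 0.9

centroid = face_centroid(prob)
pan, tilt = pan_tilt_correction(centroid, prob.shape)
```

In this sketch the camera would pan right (the blob is right of center) and tilt up (it is above center); the actual system closes this loop at frame rate, with the connectionist classifier adapting to lighting and skin color.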
BibTeX
@conference{Hunke-1994-16045,
author = {M. Hunke and Alex Waibel},
title = {Face Locating and Tracking for Human-computer Interaction},
booktitle = {Proceedings of 28th Asilomar Conference on Signals, Systems and Computers (ACSSC '94)},
year = {1994},
month = {October},
volume = {2},
pages = {1277--1281},
}
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.