Enhancing Facial Expression Classification By Information Fusion

I. Buciu, Z. Hammal, A. Caplier, N. Nikolaidis, and I. Pitas
Conference Paper, Proceedings of 14th European Signal Processing Conference (EUSIPCO '06), pp. 1-4, September, 2006

Abstract

The paper presents a system that uses the information fusion paradigm to integrate two different types of features in order to improve facial expression classification accuracy over classification based on a single feature type. The Discriminant Non-negative Matrix Factorization (DNMF) approach is used to extract the first set of features, and an automatic geometry-based feature extraction algorithm is used to retrieve the second set. These features are then concatenated into a single feature vector, i.e., fused at the feature level. Experiments showed that classification with the fused features yields higher accuracy than classification with either feature type alone.
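
The following is a minimal sketch of feature-level fusion as described above: two precomputed feature sets are concatenated per sample and passed to a single classifier. It does not reproduce the DNMF or geometric extraction steps; names such as dnmf_features and geometric_features, the feature dimensions, and the choice of an SVM classifier are illustrative assumptions, not the authors' actual code.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the two feature sets
# (appearance features from DNMF and geometric features from facial landmarks).
rng = np.random.default_rng(0)
n_samples, n_classes = 300, 6                           # e.g. six basic expressions
dnmf_features = rng.normal(size=(n_samples, 40))        # appearance-based features
geometric_features = rng.normal(size=(n_samples, 10))   # distances/angles between landmarks
labels = rng.integers(0, n_classes, size=n_samples)

# Scale each feature set separately so neither dominates, then concatenate
# them into one fused feature vector per sample (feature-level fusion).
fused = np.hstack([
    StandardScaler().fit_transform(dnmf_features),
    StandardScaler().fit_transform(geometric_features),
])

# Train a single classifier on the fused representation.
X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, test_size=0.3, random_state=0)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
print("fused-feature accuracy:", clf.score(X_te, y_te))

With real features, the same pipeline can be run on each feature set separately to compare single-feature accuracy against the fused result.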

BibTeX

@conference{Buciu-2006-120281,
author = {I. Buciu and Z. Hammal and A. Caplier and N. Nikolaidis and I. Pitas},
title = {Enhancing Facial Expression Classification By Information Fusion},
booktitle = {Proceedings of 14th European Signal Processing Conference (EUSIPCO '06)},
year = {2006},
month = {September},
pages = {1--4},
}