
Data-driven finger motion synthesis for gesturing characters

Sophie Jörg, Jessica Hodgins, and Alla Safonova
Journal Article, ACM Transactions on Graphics (TOG), Vol. 31, No. 6, November 2012

Abstract

Capturing the body movements of actors to create animations for movies, games, and VR applications has become standard practice, but finger motions are usually added manually as a tedious post-processing step. In this paper, we present a surprisingly simple method to automate this step for gesturing and conversing characters. In a controlled environment, we carefully captured and post-processed finger and body motions from multiple actors. To augment the body motions of virtual characters with plausible and detailed finger movements, our method selects finger motion segments from the resulting database, taking into account the similarity of the arm motions and the smoothness of consecutive finger motions. We investigate which parts of the arm motion best discriminate gestures with leave-one-out cross-validation and use the result as a metric to select appropriate finger motions. Our approach provides good results for a number of examples with different gesture types and is validated in a perceptual experiment.
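The sketch below illustrates, in simplified form, the kind of selection criterion the abstract describes: for each segment of a new character's arm motion, a finger motion segment is chosen from a captured database by trading off arm-motion similarity against the smoothness of the transition from the previously selected finger segment. The feature representation, distance measure, weights, and greedy selection strategy are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's code): choose, for each query arm-motion
# segment, a finger-motion segment from a database so that (a) the database
# segment's arm motion resembles the query arm motion and (b) consecutive
# selected finger segments join smoothly.

def arm_distance(query_arm, db_arm):
    """Mean per-frame Euclidean distance between arm feature trajectories
    (e.g., wrist positions or joint angles), each of shape (frames, dims)."""
    n = min(len(query_arm), len(db_arm))
    return float(np.mean(np.linalg.norm(query_arm[:n] - db_arm[:n], axis=1)))

def transition_cost(prev_fingers, next_fingers):
    """Discontinuity between the last finger pose of the previously selected
    segment and the first finger pose of a candidate segment."""
    if prev_fingers is None:
        return 0.0
    return float(np.linalg.norm(prev_fingers[-1] - next_fingers[0]))

def select_finger_segments(query_arm_segments, database, w_arm=1.0, w_smooth=0.5):
    """Greedy selection: for each query arm segment, pick the database entry
    minimizing a weighted sum of arm-motion distance and transition cost.
    `database` is a list of dicts with 'arm' and 'fingers' arrays."""
    selected = []
    prev_fingers = None
    for query_arm in query_arm_segments:
        costs = [
            w_arm * arm_distance(query_arm, entry["arm"])
            + w_smooth * transition_cost(prev_fingers, entry["fingers"])
            for entry in database
        ]
        best = database[int(np.argmin(costs))]
        selected.append(best["fingers"])
        prev_fingers = best["fingers"]
    return selected

# Toy usage with random arrays standing in for captured motion features.
rng = np.random.default_rng(0)
database = [{"arm": rng.normal(size=(30, 6)), "fingers": rng.normal(size=(30, 20))}
            for _ in range(50)]
queries = [rng.normal(size=(30, 6)) for _ in range(4)]
finger_tracks = select_finger_segments(queries, database)
```

In the paper, the distance metric over the arm motion is itself derived from a leave-one-out cross-validation study of which arm features best discriminate gestures; the fixed Euclidean distance above is a placeholder for that learned metric.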

BibTeX

@article{Jorg-2012-121990,
author = {Sophie Jörg and Jessica Hodgins and Alla Safonova},
title = {Data-driven finger motion synthesis for gesturing characters},
journal = {ACM Transactions on Graphics (TOG)},
year = {2012},
month = {November},
volume = {31},
number = {6},
}