Stable spaces for real-time clothing - Robotics Institute Carnegie Mellon University

Edilson De Aguiar, Leonid Sigal, Adrien Treuille, and Jessica K. Hodgins
Journal Article, ACM Transactions on Graphics (TOG), Vol. 29, No. 4, July, 2010

Abstract

We present a technique for learning clothing models that enables the simultaneous animation of thousands of detailed garments in real time. This surprisingly simple conditional model learns and preserves the key dynamic properties of cloth motion along with folding details. Our approach requires no a priori physical model, but rather treats training data as a "black box." We show that the models learned with our method are stable over large time-steps and can approximately resolve cloth-body collisions. We also show that within a class of methods, no simpler model covers the full range of cloth dynamics captured by ours. Our method bridges the current gap between skinning and physical simulation, combining the speed of the former with the dynamic effects of the latter. We demonstrate our approach on a variety of apparel worn by male and female human characters performing a varied set of motions typically used in video games (e.g., walking, running, and jumping).
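The abstract describes a conditional model that is learned purely from simulation data, treated as a "black box." A minimal sketch of one way such a model could be fit is shown below: reduce the cloth vertex trajectories to a low-dimensional subspace, then learn a linear map that predicts the next reduced cloth state from the current state and the body pose. All names, dimensions, and the synthetic data here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Hedged sketch: dimensions and data below are synthetic stand-ins,
# not the paper's setup. The idea follows the abstract: learn a
# conditional model of cloth dynamics directly from training frames.

rng = np.random.default_rng(0)
T, n_verts, d_pose, k = 200, 50, 12, 8  # frames, vertices, pose dims, basis size

# Synthetic "simulated" cloth frames (T x 3*n_verts) and body poses (T x d_pose).
X = rng.normal(size=(T, 3 * n_verts))
P = rng.normal(size=(T, d_pose))

# 1) Low-dimensional cloth subspace via PCA (SVD of the centered data).
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
basis = Vt[:k]                    # k x 3*n_verts
Z = (X - mean) @ basis.T          # reduced coordinates, T x k

# 2) Conditional linear dynamics: z_{t+1} ~ [z_t, pose_t, 1] @ A,
#    fit by least squares over consecutive training frames.
inputs = np.hstack([Z[:-1], P[:-1], np.ones((T - 1, 1))])  # (T-1) x (k+d_pose+1)
targets = Z[1:]                                            # (T-1) x k
A, *_ = np.linalg.lstsq(inputs, targets, rcond=None)

# 3) Roll the learned model forward from the first frame, conditioned
#    on the pose sequence, and map back to full vertex positions.
z = Z[0]
for t in range(T - 1):
    z = np.concatenate([z, P[t], [1.0]]) @ A
pred_verts = mean + z @ basis     # one predicted frame, shape (3*n_verts,)
```

Because the rollout is a single small matrix multiply per frame, many garments can be advanced per time-step, which is consistent with the real-time, thousands-of-garments claim in the abstract.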

BibTeX

@article{De-2010-122005,
  author  = {Edilson De Aguiar and Leonid Sigal and Adrien Treuille and Jessica K. Hodgins},
  title   = {Stable spaces for real-time clothing},
  journal = {ACM Transactions on Graphics (TOG)},
  year    = {2010},
  month   = {July},
  volume  = {29},
  number  = {4},
}