A perceptual control space for garment simulation

Leonid Sigal, Moshe Mahler, Spencer Diaz, Kyna McIntosh, Elizabeth Carter, Timothy Richards, and Jessica Hodgins
Journal Article, ACM Transactions on Graphics (TOG), Vol. 34, No. 4, July 2015

Abstract

We present a perceptual control space for simulation of cloth that works with any physical simulator, treating it as a black box. The perceptual control space provides intuitive, art-directable control over the simulation behavior based on a learned mapping from common descriptors for cloth (e.g., flowiness, softness) to the parameters of the simulation. To learn the mapping, we perform a series of perceptual experiments in which the simulation parameters are varied and participants rate the resulting cloth on a scale for each of the common descriptors. A multi-dimensional sub-space regression is performed on the results to build a perceptual generative model over the simulator parameters. We evaluate the perceptual control space by demonstrating that the generative model does in fact create simulated clothing that participants rate as having the expected properties. We also show that this perceptual control space generalizes to garments and motions not in the original experiments.
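The core pipeline in the abstract — vary simulator parameters, collect perceptual ratings, then regress from ratings back to parameters — can be illustrated with a minimal sketch. This is not the paper's method (the authors use a multi-dimensional sub-space regression); it substitutes an ordinary linear least-squares fit, and all data, parameter names, and the `params_for` helper are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical study data: each row pairs one simulator parameter setting
# (e.g. stretch, bend, damping stiffnesses) with the mean perceptual
# ratings (e.g. flowiness, softness) participants gave that setting.
rng = np.random.default_rng(0)
params = rng.uniform(0.0, 1.0, size=(40, 3))            # 40 settings x 3 params
mixing = np.array([[0.9, -0.2], [-0.4, 0.8], [0.1, 0.3]])
ratings = params @ mixing + 0.02 * rng.normal(size=(40, 2))  # 2 descriptors

# Fit a linear generative model mapping desired perceptual scores to
# simulator parameters (a simple stand-in for the sub-space regression).
design = np.hstack([ratings, np.ones((40, 1))])          # add intercept column
weights, *_ = np.linalg.lstsq(design, params, rcond=None)

def params_for(flowiness, softness):
    """Predict black-box simulator parameters for target descriptor scores."""
    return np.array([flowiness, softness, 1.0]) @ weights

p = params_for(0.8, 0.3)
print(p)  # one predicted value per simulator parameter
```

The resulting `params_for` plays the role of the perceptual control space: an artist specifies intuitive descriptor values, and the model returns a parameter vector to feed the simulator, which is never opened up or modified.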

BibTeX

@article{Sigal-2015-121976,
author = {Leonid Sigal and Moshe Mahler and Spencer Diaz and Kyna McIntosh and Elizabeth Carter and Timothy Richards and Jessica Hodgins},
title = {A perceptual control space for garment simulation},
journal = {ACM Transactions on Graphics (TOG)},
year = {2015},
month = {July},
volume = {34},
number = {4},
}