A Perceptual Control Space for Garment Simulation
Leonid Sigal (Disney Research Pittsburgh)
Moshe Mahler (Disney Research Pittsburgh)
Spencer Diaz (Disney Research Pittsburgh)
Kyna McIntosh (Disney Research Pittsburgh)
Elizabeth Carter (Disney Research Pittsburgh)
Timothy Richards (Walt Disney Animation Studios)
Jessica Hodgins (Disney Research Los Angeles, Disney Research Pittsburgh)
ACM SIGGRAPH 2015
July 26, 2015
We present a perceptual control space for simulation of cloth that works with any physical simulator, treating it as a black box. The perceptual control space provides intuitive, art-directable control over simulation behavior based on a learned mapping from common descriptors for cloth (e.g., flowiness, softness) to the parameters of the simulation. To learn the mapping, we perform a series of perceptual experiments in which the simulation parameters are varied and participants rate the cloth on each of the common descriptive terms using a scale. A multi-dimensional sub-space regression is performed on the results to build a perceptual generative model over the simulator parameters. We evaluate the perceptual control space by demonstrating that the generative model does, in fact, create simulated clothing that participants rate as having the expected properties. We also show that this perceptual control space generalizes to garments and motions not in the original experiments.
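To illustrate the core idea, here is a minimal sketch of mapping perceptual descriptor ratings to simulator parameters. All names, the choice of three parameters, and the plain least-squares fit are assumptions for illustration; the paper's multi-dimensional sub-space regression and its actual experimental data differ.

```python
import numpy as np

# Hypothetical training data standing in for the perceptual experiments:
# each row pairs simulator parameters with mean participant ratings.
# Parameter columns (illustrative only): stretch stiffness, bend stiffness, density.
rng = np.random.default_rng(0)
params = rng.uniform(0.0, 1.0, size=(40, 3))

# Synthetic descriptor ratings (e.g., "flowiness", "softness"), generated
# here only so that the sketch is self-contained and runnable.
ratings = params @ np.array([[-0.8, 0.1],
                             [-0.2, -0.9],
                             [0.5, 0.2]]) + 0.05 * rng.normal(size=(40, 2))

# Fit a linear map from ratings to parameters (least squares with a bias term).
X = np.hstack([ratings, np.ones((ratings.shape[0], 1))])
W, *_ = np.linalg.lstsq(X, params, rcond=None)

def generate_params(desired_ratings):
    """Map desired perceptual descriptor values to simulator parameters."""
    x = np.append(np.asarray(desired_ratings, dtype=float), 1.0)
    return x @ W

# Art-directable control: request a "flowy, soft" garment and feed the
# resulting parameters to any black-box cloth simulator.
print(generate_params([0.8, 0.7]))
```

In this sketch the "generative model" is just the fitted inverse map: an artist specifies descriptor values on the rating scale, and the model emits a full parameter vector for the simulator without the artist touching raw stiffness or density values.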