Stable Spaces for Real-time Clothing

Project Members

Edilson de Aguiar (Disney Research Pittsburgh)
Leonid Sigal (Disney Research Pittsburgh)
Adrien Treuille (Carnegie Mellon University)
Jessica Hodgins (Disney Research Los Angeles, Disney Research Pittsburgh)

Our method enables the fast animation of detailed garments for human characters. We demonstrate that, using a simple and efficient technique, the motion of the clothing (skirts, dresses, shirts) for a thousand or more characters can be computed realistically in real time.

We present a technique for learning clothing models that enables the simultaneous animation of thousands of detailed garments in real time. This surprisingly simple conditional model learns and preserves the key dynamic properties of cloth motion along with folding details. Our approach requires no a priori physical model, but rather treats the training data as a 'black box'. We show that the models learned with our method are stable over large time-steps and can approximately resolve cloth-body collisions. We also show that, within a class of methods, no simpler model covers the full range of cloth dynamics captured by ours. Our method bridges the current gap between skinning and physical simulation, combining the speed of the former with the dynamic effects of the latter. We demonstrate our approach on a variety of apparel worn by male and female human characters performing a varied set of motions typically used in video games (e.g., walking, running, and jumping).
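To make the idea of a learned conditional clothing model concrete, the sketch below shows one common way such a model can be structured: reduce the simulated cloth vertex data with PCA, then fit a linear model that predicts the next reduced cloth state from the previous reduced states and the current body pose. This is a minimal illustration under assumed data shapes (random stand-in data, hypothetical dimensions `V`, `P`, `k`), not the paper's exact formulation.

```python
import numpy as np

# Hypothetical setup: T frames of flattened cloth vertex positions and
# body-pose features. Random stand-ins for simulated training data.
rng = np.random.default_rng(0)
T, V, P, k = 200, 300, 20, 12          # frames, cloth DOFs, pose DOFs, reduced dim
cloth = rng.standard_normal((T, V))
pose = rng.standard_normal((T, P))

# 1) Reduce cloth states with PCA (via SVD of the centered data).
mean = cloth.mean(axis=0)
_, _, Vt = np.linalg.svd(cloth - mean, full_matrices=False)
basis = Vt[:k]                          # k principal components (k x V)
z = (cloth - mean) @ basis.T            # reduced cloth states (T x k)

# 2) Fit a conditional second-order linear model by least squares:
#    z[t] ~ A1 z[t-1] + A2 z[t-2] + B pose[t]
X = np.hstack([z[1:-1], z[:-2], pose[2:]])   # regressors, one row per frame
Y = z[2:]                                    # targets
W, *_ = np.linalg.lstsq(X, Y, rcond=None)    # stacked [A1; A2; B]

# 3) Run time: roll the model forward and decode back to vertex positions.
def step(z_prev, z_prev2, pose_t):
    z_t = np.hstack([z_prev, z_prev2, pose_t]) @ W
    return z_t, mean + z_t @ basis           # reduced state, full cloth mesh

z_t, verts = step(z[1], z[0], pose[2])
```

Because each time-step is only a small matrix-vector product followed by a linear decode, many garments can be advanced per frame, which is what makes the real-time, thousands-of-characters setting plausible.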



July 25, 2010
Paper File [pdf, 1.05 MB]

Copyright Notice

The documents contained in these directories are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.