Animating Non-Humanoid Characters with Human Motion Data
Katsu Yamane 1,2
Yuka Ariki 1,3
Jessica Hodgins 2,1
1 Disney Research, Pittsburgh
2 Carnegie Mellon University (CMU)
3 Nara Institute of Science and Technology, Japan
This paper presents a method for generating animations of non-humanoid characters from human motion capture data. The characters considered in this work have body proportions and/or topology significantly different from those of humans, yet are expected to convey expressions and emotions through body language that human viewers can understand. Such characters are most commonly animated by keyframing. Our method provides an alternative that leverages motion data from a human subject performing in the style of the target character.
The method consists of a statistical mapping function learned from a small set of corresponding key poses, and a physics-based optimization process that improves the physical realism. We demonstrate our approach on three characters and a variety of motions with emotional expressions.
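To make the first component concrete, the following is a minimal sketch of one way a statistical mapping learned from corresponding key poses could look. It is not the paper's actual model; it uses simple radial-basis-function (RBF) interpolation over paired human/character key poses, and all dimensions, data, and function names are illustrative.

```python
# Illustrative sketch (not the paper's method): learn a map from human poses
# to character poses by RBF interpolation over a few corresponding key poses.
import numpy as np

def fit_rbf_map(human_keys, char_keys, sigma=1.0):
    """Fit weights so each human key pose maps to its character key pose."""
    d2 = ((human_keys[:, None, :] - human_keys[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma**2))           # kernel matrix between key poses
    # Small ridge term keeps the solve well conditioned.
    return np.linalg.solve(K + 1e-8 * np.eye(len(K)), char_keys)

def map_pose(human_pose, human_keys, W, sigma=1.0):
    """Map a new human pose to a character pose by kernel-weighted blending."""
    d2 = ((human_keys - human_pose) ** 2).sum(-1)
    k = np.exp(-d2 / (2 * sigma**2))
    return k @ W

# Toy example: 4 key-pose pairs, human poses in R^6, character poses in R^4.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 6))   # human key poses (made-up data)
C = rng.normal(size=(4, 4))   # corresponding character key poses
W = fit_rbf_map(H, C)
# At a key pose, the map reproduces the corresponding character key pose.
print(np.allclose(map_pose(H[0], H, W), C[0], atol=1e-4))
```

New human poses that fall between key poses are blended smoothly toward the nearby character key poses, which is the basic behavior a key-pose-based mapping needs before any physics-based refinement.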