Abstract

Motion blending is widely accepted as a standard technique in computer animation, allowing new motions to be generated by interpolation and/or transition between motion capture sequences. To ensure smooth and seamless results, an important property has to be taken into account: similar constraint sequences have to be time-aligned. However, traditional blending approaches require the user to choose the transition time and duration manually. In addition, depending on the animation context, blending operations cannot always be performed immediately: they may only occur during a precise period of time, while preserving specific physical properties. In this paper we present an improved blending technique that allows automatic, controlled transitions between motion patterns whose parameters are not known in advance. This approach ensures coherent movements over the parameter space of the original input motions. To illustrate our approach, we focus on walking and running motions blended with jumps, where animators may vary the jump length and style. The proposed method automatically identifies the support phases of the input motions and controls a correct transition time on the fly. Moreover, the current locomotion type and speed are smoothly adapted to a specific jump type and length.
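As a rough illustration of the blending mechanism summarized above, the sketch below cross-fades two time-aligned pose sequences and picks a transition start that falls inside a common support phase. The names (support_phases, pick_transition_start, blend_transition), the foot-height thresholding, and the smoothstep weighting are illustrative assumptions for this sketch, not the paper's actual constraint detection or transition-control method.

import numpy as np

def support_phases(foot_height, threshold=0.02):
    """Boolean mask of frames where the foot is in contact (a support phase).
    Thresholding foot height is a stand-in for a real constraint detector."""
    return foot_height < threshold

def pick_transition_start(support_a, support_b, earliest=0):
    """First frame (>= earliest) where both clips are in a support phase,
    so the transition begins on compatible constraints."""
    both = support_a & support_b
    candidates = np.nonzero(both[earliest:])[0]
    if candidates.size == 0:
        raise ValueError("no common support phase found")
    return earliest + int(candidates[0])

def ease(t):
    """Smoothstep blend weight in [0, 1] for normalized transition time t."""
    return 3 * t**2 - 2 * t**3

def blend_transition(src, dst, start, duration):
    """Cross-fade from pose array src into dst over `duration` frames,
    starting at `start`. Both arrays are (frames, dofs) and assumed to be
    already time-aligned so that matching support phases coincide."""
    frames = [src[f] for f in range(start)]
    for f in range(duration):
        w = ease(f / max(duration - 1, 1))
        frames.append((1 - w) * src[start + f] + w * dst[start + f])
    frames.extend(dst[start + duration:])
    return np.asarray(frames)

# Toy usage: blend two synthetic one-DOF "motions" inside a shared support phase.
t = np.linspace(0, 6 * np.pi, 120)
walk, jump = np.sin(t)[:, None], np.cos(t)[:, None]
sup_walk = support_phases(np.abs(np.sin(t)))        # fake contact signals
sup_jump = support_phases(np.abs(np.cos(t + 0.1)))
start = pick_transition_start(sup_walk, sup_jump, earliest=30)
blended = blend_transition(walk, jump, start=start, duration=20)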
