Abstract

This paper presents a human walking model built from experimental data covering a wide range of normalized velocities. The model is structured on two levels. On the first level, global spatial and temporal characteristics are generated. On the second level, a set of parameterized trajectories produces both the position of the body in space and the internal body configuration. This is performed for a standard structure and an average configuration of the human body. The experimental context corresponding to the model is extended by allowing a continuous variation of the global spatial and temporal parameters according to the motion rendition expected by the animator. The model is based on a simple kinematic approach designed to preserve the intrinsic dynamic characteristics of the experimental model. Such an approach also allows a personification of the walking action in an interactive real-time context in most cases. A correction automaton for such motion is then proposed.
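
The two-level structure described above can be illustrated with a short sketch: level one maps a velocity normalized by body size to global cycle characteristics, and level two evaluates parameterized trajectories over the cycle phase. The function names, the relations, and all numeric constants below are illustrative placeholders, not the paper's experimental fits.

```python
import math

def global_characteristics(speed_m_s, leg_length_m):
    """Level 1: normalized velocity -> cycle length and cycle frequency.

    The relation used here is an assumed placeholder, not the paper's data."""
    rv = speed_m_s / leg_length_m                        # normalized (relative) velocity
    cycle_length = 1.35 * math.sqrt(rv) * leg_length_m   # assumed spatial relation
    cycle_freq = speed_m_s / cycle_length                # cycles per second
    return cycle_length, cycle_freq

def joint_trajectory(phase, amplitude, offset):
    """Level 2: a generic parameterized trajectory over one walking cycle.

    A real model would use per-joint curves fitted to experimental data;
    a single sinusoid stands in for them here."""
    return offset + amplitude * math.sin(2.0 * math.pi * phase)

def walk_pose(t, speed_m_s, leg_length_m):
    """Combine both levels: body position in space plus internal configuration."""
    cycle_length, cycle_freq = global_characteristics(speed_m_s, leg_length_m)
    phase = (t * cycle_freq) % 1.0                        # position within the cycle
    pelvis_x = speed_m_s * t                              # global body advance
    hip_deg = joint_trajectory(phase, amplitude=25.0, offset=0.0)    # assumed values
    knee_deg = joint_trajectory(phase, amplitude=35.0, offset=30.0)  # assumed values
    return {"pelvis_x": pelvis_x, "hip_deg": hip_deg, "knee_deg": knee_deg}

if __name__ == "__main__":
    for step in range(5):
        print(walk_pose(t=step * 0.25, speed_m_s=1.4, leg_length_m=0.9))
```

Because level one depends only on a normalized velocity and level two only on the cycle phase, the same structure supports continuous variation of the global parameters and per-character adjustment of the trajectory parameters, which is what makes a real-time, personified evaluation plausible.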
