We present a probabilistic approach to learning robust models of human motion through imitation. Combining Hidden Markov Models (HMM), Gaussian Mixture Regression (GMR), and dynamical systems allows us to extract redundancies across multiple demonstrations and build time-independent models that reproduce the dynamics of the demonstrated movements. The approach is first systematically evaluated and compared with other approaches using generated trajectories that share similarities with human gestures. Three applications on different types of robots are then presented. An experiment in which the iCub humanoid robot acquires a bimanual dancing motion shows that the system can also handle cyclic motion. An experiment in which a 7-DOF WAM robotic arm learns to hit a ball with a table tennis racket highlights the possibility of encoding several variations of a movement in a single model. Finally, an experiment in which a HOAP-3 humanoid robot learns to manipulate a spoon to feed the Robota humanoid robot demonstrates the capability of the system to handle several constraints simultaneously.
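The regression step at the heart of this kind of pipeline conditions a learned joint density on an input variable (e.g. time or HMM state) to retrieve an expected output trajectory. The following is a minimal sketch of Gaussian Mixture Regression for a 1-D input and 1-D output; the function names and the example parameters are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def gmr(t, priors, means, covs):
    """Gaussian Mixture Regression: predict E[x | t] from a joint GMM over (t, x).

    priors: (K,) component weights
    means:  (K, 2) joint means [mu_t, mu_x] per component
    covs:   (K, 2, 2) joint covariances per component
    """
    K = len(priors)
    # Responsibility h_k(t) of each component for the input value t
    h = np.array([priors[k] * gaussian_pdf(t, means[k, 0], covs[k, 0, 0])
                  for k in range(K)])
    h = h / h.sum()
    # Conditional expectation of x given t for each component:
    # mu_x + Sigma_xt / Sigma_tt * (t - mu_t)
    x_hat = np.array([means[k, 1] + covs[k, 1, 0] / covs[k, 0, 0] * (t - means[k, 0])
                      for k in range(K)])
    # Blend the per-component predictions by their responsibilities
    return float(h @ x_hat)
```

In the full approach, the mixture parameters would come from an HMM trained on the demonstrations, and the regressed output would feed a dynamical system rather than being replayed directly; this sketch only shows the conditioning step.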