Learning Coupled Dynamical Systems from Human Demonstration for Robotic Eye-Arm-Hand Coordination
An efficient, adaptive, and reliable visuomotor control system is crucial for enabling robots to remain flexible in the face of changes in the environment. This paper takes inspiration from the human eye-arm-hand coordination pattern to develop an equally robust robot controller. We recorded gaze, arm, hand, and trunk data from human subjects in reaching and grasping scenarios, with and without an obstacle in the workspace. An eye-arm-hand controller is developed, based on our extension of Coupled Dynamical Systems (CDS). We exploit the time-invariance of the CDS to allow fast adaptation to spatial and temporal perturbations during task completion. The global stability of the CDS guarantees that the eye, the arm, and the hand reach the target in retinal, operational, and grasp space, respectively. When facing perturbations, the system can re-plan its actions almost instantly, without the need for an additional planning module. The coupling profiles of the eye-arm and arm-hand systems can be modulated, allowing the behavior of each slave system to be adjusted with respect to the control signals flowing from its master system. We show how the CDS eye-arm-hand control framework can handle the presence of obstacles in the workspace. The eye-arm-hand controller is validated in a series of experiments conducted with the iCub robot.
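A minimal sketch of the master-slave coupling idea behind the CDS is given below. It is an illustrative toy, not the paper's method: simple linear point attractors and an exponential coupling profile stand in for the learned (GMM/GMR-based) dynamics and coupling functions, and all function names, gains, and coordinates are assumptions introduced for this sketch. It only shows how each slave's moving target can be gated by its master's progress toward its own target, and why a time-invariant formulation re-plans automatically under perturbation.

```python
import numpy as np

# Toy CDS sketch (assumed names and gains; coordinates expressed relative to
# each chain's start). The eye is the master of the arm; the arm is the
# master of the hand aperture.

def linear_ds(x, target, gain=2.0):
    # Time-invariant point attractor: xdot = -gain * (x - target).
    return -gain * (x - target)

def coupling(master, master_target, beta=4.0):
    # Maps the master's remaining distance-to-target into (0, 1]; it tends
    # to 1 as the master converges, releasing the slave toward its target.
    return np.exp(-beta * np.linalg.norm(master - master_target))

def simulate(eye0, arm0, hand0, eye_tgt, arm_tgt, hand_tgt,
             dt=0.01, steps=2000):
    eye, arm, hand = map(np.asarray, (eye0, arm0, hand0))
    eye_tgt, arm_tgt, hand_tgt = map(np.asarray, (eye_tgt, arm_tgt, hand_tgt))
    for _ in range(steps):
        # Master: gaze converges to the target in retinal coordinates.
        eye = eye + dt * linear_ds(eye, eye_tgt)
        # Slave 1: the arm's moving target is scaled by eye progress
        # (eye-arm coupling), so reaching lags behind gaze.
        arm = arm + dt * linear_ds(arm, arm_tgt * coupling(eye, eye_tgt))
        # Slave 2: hand aperture follows arm progress (arm-hand coupling),
        # closing on the grasp configuration only as the arm nears the object.
        hand = hand + dt * linear_ds(hand, hand_tgt * coupling(arm, arm_tgt))
    return eye, arm, hand

if __name__ == "__main__":
    # If eye_tgt or arm_tgt were displaced mid-run (a perturbation), the same
    # update equations would redirect all three chains without re-planning.
    eye, arm, hand = simulate(
        eye0=[0.2, 0.1], arm0=[0.0, 0.0, 0.0], hand0=[0.0],
        eye_tgt=[0.0, 0.0], arm_tgt=[0.4, 0.3, 0.2], hand_tgt=[1.0])
    print(eye, arm, hand)
```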
LUKIC_HUMANOIDS_2012.pdf (postprint, open access, Adobe PDF, 1.81 MB)
LUKIC_HUMANOIDS_2012_2.mp4 (open access, MP4 video, 8.89 MB)