Abstract

In the context of object interaction and manipulation, one characteristic of a robust grasp is its ability to comply with external perturbations applied to the grasped object while still maintaining the grasp. In this work we introduce an approach for grasp adaptation that learns a statistical model to adapt hand posture based solely on the perceived contact between the object and the fingers. Using a multi-step learning procedure, the model dataset is built by first demonstrating an initial hand posture, which is then physically corrected by a human teacher pressing on the fingertips, exploiting compliance in the robot hand. The learner then replays the resulting sequence of hand postures to generate a dataset of posture-contact pairs that are not influenced by the touch of the teacher. A key feature of this work is that the learned model may be further refined by repeating the correction-replay steps. Alternatively, the model may be reused in the development of new models, characterized by the contact signatures of a different object. Our approach is empirically validated on the iCub robot. We demonstrate grasp adaptation in response to changes in contact, and show successful model reuse and improved adaptation with additional rounds of model refinement.
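The abstract does not specify the form of the statistical model relating perceived contact to hand posture. As a purely illustrative sketch, such a mapping could be realized by fitting a joint Gaussian mixture over posture-contact pairs and conditioning it on the observed contact (Gaussian Mixture Regression). The dimensions, variable names, and use of scikit-learn and SciPy below are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: map perceived fingertip contact to a hand posture
# via Gaussian Mixture Regression over posture-contact pairs. Dimensions and
# data are placeholders, not the paper's setup.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

N_CONTACT = 5   # e.g., one pressure reading per fingertip (assumed)
N_POSTURE = 9   # e.g., number of hand joint angles (assumed)

# Stand-in dataset of posture-contact pairs, as produced by the replay step.
rng = np.random.default_rng(0)
contacts = rng.random((200, N_CONTACT))
postures = contacts @ rng.random((N_CONTACT, N_POSTURE)) \
    + 0.05 * rng.standard_normal((200, N_POSTURE))
data = np.hstack([contacts, postures])

# Fit a joint Gaussian mixture over (contact, posture).
gmm = GaussianMixture(n_components=4, covariance_type="full",
                      random_state=0).fit(data)

def adapt_posture(contact):
    """Condition the joint GMM on the observed contact to predict a posture."""
    c = np.asarray(contact, dtype=float)
    ic = slice(0, N_CONTACT)                      # contact block
    ip = slice(N_CONTACT, N_CONTACT + N_POSTURE)  # posture block
    resp = np.empty(gmm.n_components)
    cond_means = np.empty((gmm.n_components, N_POSTURE))
    for k in range(gmm.n_components):
        mu, S = gmm.means_[k], gmm.covariances_[k]
        Scc, Spc = S[ic, ic], S[ip, ic]
        # Responsibility of component k for the observed contact.
        resp[k] = gmm.weights_[k] * multivariate_normal.pdf(c, mu[ic], Scc)
        # Conditional mean of the posture given the contact, under component k.
        cond_means[k] = mu[ip] + Spc @ np.linalg.solve(Scc, c - mu[ic])
    resp /= resp.sum()
    return resp @ cond_means  # responsibility-weighted posture estimate

# Example query: adapt the hand posture to a newly sensed contact pattern.
new_posture = adapt_posture(contacts[0])
```

In such a scheme, the correction-replay refinement described above would simply enlarge the dataset of posture-contact pairs before refitting, and model reuse would correspond to initializing a new model from an existing one before adapting it to the contact signatures of a different object.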
