Face classification using touch with a humanoid robot hand
This paper presents an experiment in which the iCub humanoid robot learns to recognize faces through proprioceptive information. We take inspiration from the way blind people recognize faces, i.e., through tactile exploration of the person's face. The iCub robot's tactile sensors are used to provide compliance in the hand motion so that the fingers smoothly scan the facial features. The displacement of the fingers as the robot explores the face is used to build a model of the face using Hidden Markov Models. We show that the robot can successfully distinguish among the faces of a standard doll and the faces of three humanoid robots: the HOAP-3 robot, a Robota doll robot, and MyDreamBaby, a commercial robotic doll.
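As a rough illustration of the approach described in the abstract (not the authors' implementation), the sketch below trains one Gaussian HMM per face on finger-displacement trajectories and classifies a new exploration run by maximum log-likelihood. It assumes the trajectories are available as NumPy arrays of shape (timesteps, joints) and uses the hmmlearn library purely for illustration; all names and parameters are hypothetical.

```python
# Minimal sketch: one Gaussian HMM per face, trained on finger-joint
# displacement trajectories, classification by maximum log-likelihood.
# Assumes hmmlearn and NumPy; data and labels are placeholders.
import numpy as np
from hmmlearn import hmm

def train_face_model(trajectories, n_states=5):
    """Fit a Gaussian HMM to a list of (T_i, n_joints) displacement arrays."""
    X = np.concatenate(trajectories)          # stack all exploration runs
    lengths = [len(t) for t in trajectories]  # per-run lengths for hmmlearn
    model = hmm.GaussianHMM(n_components=n_states,
                            covariance_type="diag", n_iter=100)
    model.fit(X, lengths)
    return model

def classify(models, trajectory):
    """Return the face label whose HMM assigns the highest log-likelihood."""
    scores = {label: m.score(trajectory) for label, m in models.items()}
    return max(scores, key=scores.get)

# Hypothetical usage, where `data` maps face labels to lists of recorded runs:
# models = {label: train_face_model(runs) for label, runs in data.items()}
# predicted = classify(models, new_exploration_run)
```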
Files:
- FacesRecognitionForSubmission.avi (AVI, 9.25 MB, open access)
- TFRpaper.pdf (Adobe PDF, 931.07 KB, open access)