Face classification using touch with a humanoid robot hand
This paper presents an experiment in which the iCub humanoid robot learns to recognize faces through proprioceptive information. We take inspiration from the way blind people recognize faces, i.e. through tactile exploration. The iCub robot's tactile sensors provide compliance in the hand motion so that the fingers smoothly scan the facial features. The displacement of the fingers as the robot explores a face is used to build a model of that face using Hidden Markov Models. We show that the robot can successfully distinguish among the faces of a standard doll and of three humanoid robots: the HOAP-3 robot, a Robota doll robot, and MyDreamBaby, a commercial robotic doll.
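The classification scheme described above, one HMM per known face and recognition by maximum likelihood over the finger-displacement sequence, can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the state counts, transition/emission probabilities, and the quantization of finger displacements into three symbols ("flat", "small bump", "large bump") are all hypothetical, and likelihoods are computed with the standard forward algorithm.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi: initial state probabilities, A: state transition matrix,
    B: per-state emission probabilities. Uses the scaled forward
    algorithm to avoid numerical underflow on long sequences.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    loglik = 0.0
    for t in range(1, len(obs)):
        s = sum(alpha)
        loglik += math.log(s)
        alpha = [a / s for a in alpha]  # rescale to sum to 1
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
    return loglik + math.log(sum(alpha))

# Two toy face models over 3 quantized finger-displacement symbols
# (0 = flat, 1 = small bump, 2 = large bump). In the paper one model
# per face would be trained from recorded exploration trajectories;
# here the parameters are hand-picked for illustration.
models = {
    "doll": ([0.9, 0.1],
             [[0.8, 0.2], [0.3, 0.7]],
             [[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]]),
    "HOAP-3": ([0.5, 0.5],
               [[0.5, 0.5], [0.5, 0.5]],
               [[0.1, 0.2, 0.7], [0.6, 0.3, 0.1]]),
}

def classify(obs):
    """Assign the sequence to the face model with the highest likelihood."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

print(classify([0, 0, 1, 0, 0]))  # a mostly flat scan -> doll
```

The key design point is that each candidate face gets its own generative model, so adding a new face only requires training one more HMM rather than retraining a joint classifier.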