Real-Time Audio-Visual Calls Detection System for a Chicken Robot

The design, study, and control of mixed animal-robot societies is a field of scientific exploration that can open new opportunities for studying and controlling groups of social animals. In the Chicken Robot project we develop a mobile robot that is socially accepted by chicks and able to interact with them through appropriate communication channels. For interaction purposes, the robot has to know the positions of all birds in the experimental area and detect those uttering calls. In this paper, we present an audio-visual approach to locating the chicks on the scene and detecting their calling activity in real time. Visual tracking is provided by a marker-based tracker with the help of an overhead camera. Sound localization is achieved by a beamforming approach using an array of sixteen microphones. Visual and sound information are probabilistically fused to detect calling activity. Experiments using e-puck robots in place of real chicks demonstrate that our system is capable of detecting sound-emission activity with more than 90% probability.
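The two acoustic steps mentioned in the abstract can be sketched as a delay-and-sum beamformer followed by a simple Bayesian assignment of the acoustic bearing to one of the visually tracked chicks. Everything below — the linear array geometry, 5 cm microphone pitch, sample rate, and Gaussian bearing-error model — is an assumption made for illustration, not the paper's actual pipeline.

```python
import numpy as np

# Hypothetical setup: a 16-microphone linear array with 5 cm pitch,
# centred at the origin. The paper's real array geometry is not given here.
FS = 16000        # sample rate (Hz)
C = 343.0         # speed of sound (m/s)
N_MICS = 16
MIC_X = (np.arange(N_MICS) - (N_MICS - 1) / 2) * 0.05  # mic positions (m)

def steered_power(signals, angle_deg):
    """Delay-and-sum output power for a far-field source at `angle_deg`.

    `signals` is (N_MICS, n_samples); delays are applied as
    integer-sample shifts for simplicity.
    """
    delays = MIC_X * np.sin(np.radians(angle_deg)) / C   # seconds
    shifts = np.round(delays * FS).astype(int)
    summed = sum(np.roll(sig, -s) for sig, s in zip(signals, shifts))
    return np.mean((summed / N_MICS) ** 2)

def localize(signals, angles=np.arange(-90.0, 90.5, 1.0)):
    """Return the candidate bearing with maximum steered response power."""
    powers = np.array([steered_power(signals, a) for a in angles])
    return angles[np.argmax(powers)]

def fuse(sound_angle, chick_angles, sigma=5.0):
    """Posterior over which tracked chick emitted the call, assuming a
    Gaussian bearing error (std `sigma` degrees) and a uniform prior."""
    lik = np.exp(-0.5 * ((np.asarray(chick_angles) - sound_angle) / sigma) ** 2)
    return lik / lik.sum()

# Synthetic check: a 1 kHz call arriving from 30 degrees.
t = np.arange(2048) / FS
true_delays = MIC_X * np.sin(np.radians(30.0)) / C
signals = np.array([np.sin(2 * np.pi * 1000.0 * (t - d)) for d in true_delays])

angle = localize(signals)
posterior = fuse(angle, chick_angles=[-40.0, 10.0, 31.0])
print(angle, posterior)  # bearing near 30; the chick at 31 deg dominates
```

In practice the steered-response-power scan would run over two dimensions (or be restricted to the bearings of tracked chicks), and the fusion would also weight the visual confidence of each track, but the same likelihood-normalization structure applies.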

Published in:
Proceedings of the 14th International Conference on Advanced Robotics, 1-6
Presented at:
14th International Conference on Advanced Robotics (ICAR 2009), Munich, Germany, July 22-26, 2009

 Record created 2009-06-20, last modified 2020-04-20
