Real-Time Audio-Visual Calls Detection System for a Chicken Robot
The design, study, and control of mixed animal-robot societies is a field of scientific exploration that can open new opportunities for studying and controlling groups of social animals. In the Chicken Robot project we are developing a mobile robot that is socially acceptable to chicks and able to interact with them through appropriate communication channels. For interaction purposes, the robot has to know the positions of all birds in the experimental area and detect which of them are uttering calls. In this paper, we present an audio-visual approach to locating the chicks in the scene and detecting their calling activity in real time. Visual tracking is provided by a marker-based tracker using an overhead camera. Sound localization is achieved by a beamforming approach using an array of sixteen microphones. The visual and acoustic information is combined probabilistically to detect calling activity. Experiments using e-puck robots in place of real chicks demonstrate that our system can detect sound emission activity with more than 90% probability.
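The abstract gives no implementation details, so the following Python sketch only illustrates the general idea described above: a delay-and-sum beamformer is evaluated at the visually tracked chick positions and the resulting steered powers are normalised into per-chick calling probabilities. The function names (delay_and_sum_power, calling_probabilities), the choice of delay-and-sum beamforming, and the normalisation used as the "probabilistic mixing" are assumptions for illustration, not the paper's actual method.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

    def delay_and_sum_power(signals, mic_positions, candidate_xy, fs):
        """Steered response power of a delay-and-sum beamformer at one
        candidate source position (hypothetical helper; the paper does not
        state which beamforming variant it uses)."""
        n_mics, n_samples = signals.shape
        # Distance from the candidate point to every microphone.
        dists = np.linalg.norm(mic_positions - candidate_xy, axis=1)
        # Relative arrival delays in samples, w.r.t. the closest microphone.
        delays = np.round((dists - dists.min()) / SPEED_OF_SOUND * fs).astype(int)
        # Align the channels by removing each delay, sum them, measure energy.
        aligned = np.zeros(n_samples)
        for m in range(n_mics):
            aligned[:n_samples - delays[m]] += signals[m, delays[m]:]
        return float(np.sum(aligned ** 2))

    def calling_probabilities(signals, mic_positions, chick_positions, fs):
        """Combine acoustic evidence with the visually tracked positions:
        evaluate the beamformer at each tracked chick location and normalise
        the steered powers into a per-chick calling probability."""
        powers = np.array([delay_and_sum_power(signals, mic_positions, xy, fs)
                           for xy in chick_positions])
        return powers / powers.sum()

    # Toy usage: a 16-microphone array and two tracked chicks (synthetic data).
    fs = 16000
    signals = np.random.randn(16, fs // 10)        # one 100 ms audio frame
    mic_positions = np.random.rand(16, 2) * 2.0    # microphones in a 2 m x 2 m arena
    chick_positions = [np.array([0.5, 0.5]), np.array([1.5, 1.0])]
    print(calling_probabilities(signals, mic_positions, chick_positions, fs))

In the real system the candidate positions would come from the marker-based overhead tracker rather than being fixed by hand, and the acoustic scores would be fused with the visual information over time rather than per frame.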
WOS:000270815500058
2009
1
6
REVIEWED
Event name | Event place | Event date |
 | Munich, Germany | July 22-26, 2009 |