Animal-Robot Interaction for Ethological Studies: An Advanced Framework Based on Socially Integrated Mobile Robots

A robot properly introduced into an animal group, accepted by the animals, and capable of interacting with them is a very powerful tool for advanced ethological research, particularly on gregarious animals. Moreover, such robots can find applications in the management of farm animals and wildlife. This field of scientific research, often referred to as animal-robot interaction, has received attention only recently. Very few projects have been completed or are currently running, and general methods and techniques are lacking. This is partly due to the challenges of managing such projects, which require up-to-date expertise from multiple disciplines, notably biology and engineering. This thesis presents tools to run, monitor, and analyze experiments with mixed groups of animals and robots. As a model animal, we selected the domestic chicken, first because it is a well-studied animal, providing a solid biological knowledge base, and second because it is one of the most important farm animals. Our framework includes the following components: mobile robots, monitoring tools to observe and record experiments, tools to extract behavioral parameters from the recorded experimental data, and tools to analyze and visualize results. The mobile robot we designed for experiments with domestic chickens, the PoulBot, is a modification of the marXbot, a research robot developed in our laboratory at EPFL. We extended the standard marXbot configuration (providing, among other hardware, a locomotion base, an omnidirectional camera, a speaker, and an i.MX31 processor) with a color LED pattern module and a protective bumper. An array of 16 microphones with an acquisition board and a pecking device were also developed as extension modules. As a result, the PoulBot robot can use both visual and acoustic communication channels, the two most important for birds. To make the chickens accept the robot, we exploited the filial imprinting mechanism.
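The abstract does not detail how the 16-microphone array is used for sound-source localization; as a rough illustration of the general idea behind acoustic localization with such an array, here is a minimal delay-and-sum beamformer sketch in Python/NumPy. All names, the grid-search formulation, and the parameters are illustrative assumptions, not the PoulBot implementation (which, as described below, further couples the acoustic estimate probabilistically with the visual tracker).

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, an assumed value for air at room temperature

def delay_and_sum(signals, mic_positions, candidate_points, fs):
    """Score candidate source positions by delay-and-sum beamforming.

    signals: (n_mics, n_samples) array of synchronized recordings.
    mic_positions: (n_mics, 2) microphone coordinates in meters.
    candidate_points: (n_points, 2) grid of hypothesized source positions.
    fs: sampling rate in Hz.
    Returns the power of the delay-compensated sum at each candidate point;
    the true source position yields coherent summation and maximal power.
    """
    n_mics, n_samples = signals.shape
    powers = np.empty(len(candidate_points))
    for i, p in enumerate(candidate_points):
        dists = np.linalg.norm(mic_positions - p, axis=1)
        # Propagation delays relative to the closest microphone, in samples.
        delays = np.round((dists - dists.min()) / SPEED_OF_SOUND * fs).astype(int)
        aligned = np.zeros(n_samples)
        for m in range(n_mics):
            d = delays[m]
            # Shift each channel back by its hypothesized delay and sum.
            aligned[: n_samples - d] += signals[m, d:]
        powers[i] = np.mean(aligned ** 2)
    return powers
```

A real system would search a finer grid (or optimize continuously) and work on short frames gated by a call detector, but the coherent-summation principle is the same.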
Monitoring tools play an essential role in ethological research, a role that is often overlooked when planning experiments. For visual monitoring and recording, we used the open-source tracking software SwisTrack, whose component-based structure allowed us to implement the components necessary for chicken-robot experimentation. To detect the calling activity of chicks, we used a beamforming technique probabilistically coupled with the visual tracker. The robot design and tools were validated in several series of experiments. These experiments were mainly meant to demonstrate that the robot can be socially integrated into animal groups as a surrogate hen, and to study how the strength of individual attraction affects the overall group attraction to the robot and how group attraction, in turn, affects individual attraction. Six PoulBot robots were used in three principal series of experiments, each series lasting one month. To extract trajectories of individual chicks from the recorded video data, we used variational Bayesian Gaussian mixture model classification with particle-filter-based prediction of the chicks' future positions. We also developed a set of MATLAB scripts to estimate and visualize relevant behavioral metrics (animal speed, robot-animal distances, etc.) that are further used to analyze and model animal behavior. In addition, we built a classifier to automatically identify whether and how strongly a chick is imprinted on the robot. We believe that the presented framework extends the current state of the art in the field of animal-robot interaction. In the short term, the results obtained in our studies could become a foundation for designing novel intelligent robotic systems based on natural behaviors for high-throughput ethological laboratory studies; in the long term, they could be used in farming to improve the breeding conditions of poultry.
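The trajectory-extraction pipeline named above combines variational Bayesian Gaussian mixture classification with particle-filter prediction; the abstract gives no implementation details. As a minimal sketch of just the particle-filter component, here is a bootstrap filter in Python/NumPy that smooths noisy per-frame detections of one chick and bridges short detector dropouts. The function name, the random-walk motion model, and all parameter values are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def particle_filter_track(detections, n_particles=500, motion_std=2.0,
                          meas_std=5.0, seed=0):
    """Bootstrap particle filter for a chick's 2-D position.

    detections: (T, 2) array of per-frame position measurements (pixels);
    rows of NaN mark frames where the detector lost the chick.
    Returns (T, 2) filtered position estimates.
    """
    rng = np.random.default_rng(seed)
    particles = detections[0] + rng.normal(0, meas_std, (n_particles, 2))
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = np.empty_like(detections)
    for t, z in enumerate(detections):
        # Predict: diffuse particles with a random-walk motion model.
        particles += rng.normal(0, motion_std, particles.shape)
        if not np.any(np.isnan(z)):
            # Update: reweight by the Gaussian likelihood of the measurement.
            d2 = np.sum((particles - z) ** 2, axis=1)
            weights *= np.exp(-0.5 * d2 / meas_std ** 2)
            weights += 1e-300  # guard against total underflow
            weights /= weights.sum()
            # Resample when the effective sample size collapses.
            if 1.0 / np.sum(weights ** 2) < n_particles / 2:
                idx = rng.choice(n_particles, n_particles, p=weights)
                particles = particles[idx]
                weights.fill(1.0 / n_particles)
        estimates[t] = np.average(particles, axis=0, weights=weights)
    return estimates
```

In a multi-chick setting such as the thesis experiments, the predicted particle clouds would serve as priors for assigning image blobs to individuals (the Gaussian-mixture classification step), which this single-target sketch deliberately omits.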

Mondada, Francesco
Lausanne, EPFL
Other identifiers:
urn: urn:nbn:ch:bel-epfl-thesis4981-6

Note: The status of this file is: EPFL only

Record created 2011-01-06, last modified 2018-01-28
