An artificial life environment for autonomous virtual agents with multi-sensorial and multi-perceptive features
Our approach is based on the standard multi-sensory integration theory of neuroscience, in which signals from a single object arriving through distinct sensory systems are combined. The steps of signal acquisition, filtering, selection and simplification that precede proprioception, active perception and predictive perception are integrated into virtual sensors and a virtual environment. We focus on two aspects: (1) the assignment problem: determining which sensory stimuli belong to the same virtual object; and (2) the sensory recoding problem: recoding signals into a common format before combining them. We have developed three novel methodologies to map the information coming from the virtual sensors of vision, audition and touch, as well as from the virtual environment, into the form of a 'cognitive map'. Copyright © 2004 John Wiley and Sons, Ltd.
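The two problems named above can be illustrated with a minimal sketch. This is not the paper's actual method: the nearest-neighbour grouping, the modality ranges, and all names below are illustrative assumptions. It shows (1) assignment of stimuli from different virtual sensors to the nearest virtual object, and (2) recoding of each modality's raw signal into a common normalized format before entry into a cognitive map.

```python
# Hypothetical sketch of the assignment and recoding problems.
# Grouping rule, modality ranges, and data are illustrative assumptions.
import math

def assign_stimuli(stimuli, objects, max_dist=1.0):
    """Assignment problem: attach each stimulus (modality, position,
    signal) to the nearest known object position within max_dist."""
    assignments = {}
    for modality, pos, signal in stimuli:
        best, best_d = None, max_dist
        for obj_id, obj_pos in objects.items():
            d = math.dist(pos, obj_pos)
            if d <= best_d:
                best, best_d = obj_id, d
        if best is not None:
            assignments.setdefault(best, []).append((modality, signal))
    return assignments

def recode(modality, signal):
    """Recoding problem: map a raw modality signal into a common
    [0, 1] intensity. The per-modality ranges are assumed values."""
    ranges = {"vision": 255.0, "audition": 120.0, "touch": 10.0}
    return min(signal / ranges[modality], 1.0)

# Two virtual objects and three stimuli from distinct virtual sensors.
objects = {"ball": (0.0, 0.0), "bell": (3.0, 4.0)}
stimuli = [
    ("vision", (0.1, 0.1), 200.0),   # bright patch near the ball
    ("touch", (0.0, 0.2), 4.0),      # contact force near the ball
    ("audition", (2.9, 4.1), 90.0),  # sound near the bell
]

grouped = assign_stimuli(stimuli, objects)
cognitive_map = {
    obj: {m: recode(m, s) for m, s in sigs}
    for obj, sigs in grouped.items()
}
# cognitive_map now holds one common-format entry per object,
# e.g. the ball carries both a visual and a tactile intensity.
```

In this toy setting, the vision and touch stimuli are grouped under the ball and the auditory stimulus under the bell, after which all three signals share the same [0, 1] intensity scale and can be combined per object.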
Virtual Reality Lab., Swiss Federal Institute of Technology (EPFL), 1015 Lausanne, Switzerland