Abstract

A biologically inspired computational model of rodent representation-based (locale) navigation is presented. The model combines visual input in the form of realistic two-dimensional grey-scale images and odometer signals to drive the firing of simulated place and head direction cells via Hebbian synapses. The space representation is built incrementally and on-line, without any prior information about the environment, and consists of a large population of location-sensitive units (place cells) with overlapping receptive fields. Goal navigation is performed using reinforcement learning in continuous state and action spaces, where the state space is represented by the population activity of the place cells. The model reproduces a number of behavioral and neurophysiological findings in rodents. Its performance was tested on both simulated and real Khepera mobile robots in a set of behavioral tasks and is comparable to that of animals in similar tasks.
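
The pipeline summarised above, sensory features driving place cells through Hebbian plasticity and a reinforcement learner reading out the place-cell population as its state, can be illustrated with a minimal sketch. The Python below is an assumption-laden illustration, not the paper's implementation: it uses an Oja-stabilised Hebbian update and a simple TD actor-critic, it discretises the action space into a few headings where the paper works in a continuous action space, and all dimensions, names, and learning rates are invented for the example.

    # Hypothetical sketch only: Hebbian place-cell layer driven by a combined
    # visual/odometric feature vector, plus an actor-critic readout of the
    # place-cell population. Sizes and rates are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    N_FEATURES = 64   # combined visual + odometric input dimension (assumed)
    N_PLACE = 100     # number of place cells (assumed)
    N_ACTIONS = 8     # discretised headings; the paper uses continuous actions

    W_place = rng.normal(scale=0.1, size=(N_PLACE, N_FEATURES))  # input -> place cells
    w_critic = np.zeros(N_PLACE)                                 # place cells -> value
    W_actor = np.zeros((N_ACTIONS, N_PLACE))                     # place cells -> actions

    def place_activity(features):
        """Population activity of the place cells for the current input."""
        return np.maximum(W_place @ features, 0.0)  # rectified linear response

    def hebbian_step(features, eta=0.01):
        """Oja-stabilised Hebbian update of input -> place-cell synapses."""
        global W_place
        r = place_activity(features)
        # Oja's rule keeps weights bounded: dw = eta * r * (x - r * w)
        W_place += eta * (np.outer(r, features) - (r ** 2)[:, None] * W_place)

    def actor_critic_step(r, r_next, reward, action, gamma=0.95, eta=0.05):
        """TD actor-critic update with the place-cell population as state."""
        global w_critic, W_actor
        delta = reward + gamma * (w_critic @ r_next) - (w_critic @ r)  # TD error
        w_critic += eta * delta * r          # critic: value weights
        W_actor[action] += eta * delta * r   # actor: reinforce the taken action

    # One illustrative step on random "sensor" input:
    x, x_next = rng.random(N_FEATURES), rng.random(N_FEATURES)
    hebbian_step(x)
    r, r_next = place_activity(x), place_activity(x_next)
    prefs = W_actor @ r
    probs = np.exp(prefs - prefs.max()); probs /= probs.sum()  # softmax policy
    a = rng.choice(N_ACTIONS, p=probs)
    actor_critic_step(r, r_next, reward=0.0, action=a)

Note that, as in the abstract, both the critic and the actor read only from the overlapping place-cell population activity, never from raw coordinates, so the space representation itself is the state of the learner.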
