Infoscience

Thesis

Memory formation and recall in recurrent spiking neural networks

Our brain has the capacity to analyze a visual scene in a split second, to learn how to play an instrument, and to remember events, faces, and concepts. Neurons underlie all of these diverse functions. Neurons, the cells within the brain that generate and transmit electrical activity, communicate with each other through chemical synapses. These synaptic connections change dynamically with experience, a process referred to as synaptic plasticity, which is thought to be at the core of the brain's ability to learn and to process the world in sophisticated ways. Yet our understanding of the rules of synaptic plasticity remains limited. To enable efficient computation among neurons, or to serve as a memory trace, synapses must create stable connectivity patterns between neurons. However, we still lack a theoretical explanation of how stable connectivity patterns can form in the presence of synaptic plasticity. Since the dynamics of recurrently connected neurons depend on their connections, which themselves change in response to the network dynamics, synaptic plasticity and network dynamics have to be treated as a compound system. Due to the nonlinear nature of this system, an analytical treatment is challenging. Network simulations that model the interplay between network connectivity and synaptic plasticity can therefore provide valuable insights. However, many existing network models that implement biologically plausible forms of plasticity become unstable. This suggests that current models do not accurately describe biological networks, which function without succumbing to exploding activity. The instability of these network simulations could originate from the fact that theoretical studies have focused almost exclusively on Hebbian plasticity at excitatory synapses. Hebbian plasticity strengthens the connection between neurons that are active together.
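To make the instability argument concrete, here is a minimal, hypothetical rate-based sketch (not the spiking model of this thesis): in a recurrent network with a purely Hebbian update, co-active neurons strengthen their mutual connections, stronger connections raise activity further, and the weights grow without bound. All names and parameter values are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy model: N rate neurons, all-to-all recurrent connectivity.
rng = np.random.default_rng(0)
N = 50
W = rng.uniform(0.0, 0.1, size=(N, N))  # excitatory weights (assumed scale)
np.fill_diagonal(W, 0.0)                # no self-connections
r = rng.uniform(0.0, 1.0, size=N)       # firing rates
eta = 0.01                              # Hebbian learning rate (assumed)

for _ in range(200):
    r = np.tanh(W @ r)             # network activity depends on the weights...
    W += eta * np.outer(r, r)      # ...and Hebbian plasticity strengthens
    np.fill_diagonal(W, 0.0)       # the connections of co-active pairs

# Positive feedback: the mean weight, initially ~0.05, has grown many-fold.
print(W.mean())
```

The loop makes the compound-system point from the text explicit: activity shapes the weights and the weights shape activity, and with Hebbian plasticity alone this feedback is purely amplifying.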
Biological networks, however, display a large variety of forms of synaptic plasticity and homeostatic mechanisms beyond Hebbian plasticity. Furthermore, inhibitory synapses can undergo plasticity as well. These diverse forms of plasticity are active at the same time, and the computational role of most of these synaptic dynamics remains elusive. This raises the important question of whether forms of plasticity that have not previously been considered could, in combination with Hebbian plasticity, lead to stable network dynamics. Here we show that by combining multiple forms of plasticity with distinct roles, a recurrently connected spiking network model self-organizes to distinguish and extract multiple overlapping external stimuli. Moreover, we show that the acquired network structures remain stable over hours while plasticity is active. This long-term stability allows the network to function as an associative memory and to correctly classify distorted or partially cued stimuli. During intervals in which no stimulus is shown, the network dynamically remembers the last stimulus as selective delay activity. Taken together, this work suggests that multiple forms of plasticity and homeostasis, acting on different timescales, have to work together to create stable connectivity patterns in neuronal networks that enable them to perform relevant computations.
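As a toy illustration of how a non-Hebbian mechanism can restore stability, the sketch below extends the same hypothetical rate-based model with a homeostatic normalization that holds each neuron's total input weight at a fixed budget. This is one simple stand-in for the combination of mechanisms studied in the thesis, not the thesis model itself; parameter values are assumptions.

```python
import numpy as np

# Same hypothetical toy model, plus a homeostatic constraint (assumed):
# after each Hebbian update, every neuron's summed input weight is
# rescaled back to a fixed budget w_total.
rng = np.random.default_rng(0)
N = 50
W = rng.uniform(0.0, 0.1, size=(N, N))
np.fill_diagonal(W, 0.0)
r = rng.uniform(0.0, 1.0, size=N)
eta = 0.01      # Hebbian learning rate (assumed)
w_total = 1.0   # per-neuron input-weight budget (assumed)

for _ in range(200):
    r = np.tanh(W @ r)
    W += eta * np.outer(r, r)                    # Hebbian growth, as before
    np.fill_diagonal(W, 0.0)
    W *= w_total / W.sum(axis=1, keepdims=True)  # homeostatic rescaling

# Weights now stay bounded: every row of W sums to exactly w_total,
# so Hebbian plasticity redistributes synaptic strength instead of
# letting it grow without limit.
```

Divisive normalization is only one of many possible stabilizing mechanisms; the broader point of the abstract is that several such mechanisms, operating on different timescales, must act together in combination with Hebbian plasticity.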
