Abstract

Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of such functional activity patterns is hippocampal replay: propagating bursts of place-cell activity that are critical for memory consolidation. The sudden and repeated occurrences of these burst states during ongoing neural activity suggest metastable neural circuit dynamics. Because metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model that accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to stochastic spiking and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie to obtain a "chemical Langevin equation", which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and Up-Down state dynamics) by means of phase-plane analysis. An extension of the Langevin equation for small network sizes is also presented. The stochastic neural mass model constitutes the basic component of our mesoscopic model for replay. We show that the mesoscopic model faithfully captures the statistical structure of individual replayed trajectories in microscopic simulations and in previously reported experimental data. Moreover, compared to the deterministic Romani-Tsodyks model of place-cell dynamics, it exhibits a higher level of variability in the order, direction, and timing of replayed trajectories, which appears more biologically plausible and could be functionally desirable. This variability is the product of a new dynamical regime in which metastability emerges from a complex interplay between finite-size fluctuations and local fatigue.
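To give a concrete feel for the kind of mesoscopic description summarized above, the following is a minimal sketch of an Euler-Maruyama simulation of a stochastic neural mass model with short-term synaptic depression and finite-size (chemical-Langevin-style) noise. All parameter values, the sigmoidal transfer function, and the discretization are illustrative assumptions, not the exact equations derived in the paper.

```python
# Minimal sketch of an Euler-Maruyama integration of a stochastic neural mass model
# with short-term synaptic depression and finite-size noise. Parameter values,
# the transfer function f, and the discretization are illustrative assumptions.
import numpy as np

N = 500          # network size; finite-size fluctuations scale as 1/sqrt(N)
tau_m = 0.01     # input/membrane time constant (s)
tau_D = 0.2      # recovery time constant of synaptic depression (s)
U = 0.3          # fraction of synaptic resources used per spike
J = 0.08         # recurrent coupling strength (arbitrary units)
I_ext = 0.3      # constant external drive
dt = 1e-4        # integration time step (s)
T = 5.0          # total simulated time (s)

def f(h):
    """Sigmoidal transfer function: input potential -> population firing rate (Hz)."""
    return 50.0 / (1.0 + np.exp(-(h - 1.0) / 0.2))

steps = int(T / dt)
h, x = 0.0, 1.0                       # input potential, available synaptic resources
rate = np.empty(steps)
rng = np.random.default_rng(0)

for t in range(steps):
    r = f(h)
    # Chemical-Langevin-style noise: the empirical population activity of N Poisson
    # neurons in a bin of width dt fluctuates around r with variance ~ r / (N * dt).
    A = max(r + rng.normal() * np.sqrt(r / (N * dt)), 0.0)
    # Mean-field dynamics with Tsodyks-Markram-type short-term depression
    h += dt * (-h + J * U * x * A + I_ext) / tau_m
    x += dt * ((1.0 - x) / tau_D - U * x * A)
    x = min(max(x, 0.0), 1.0)
    rate[t] = r

print(f"mean population rate: {rate.mean():.1f} Hz")
```

In this sketch, increasing N suppresses the noise term and recovers a deterministic neural mass limit, illustrating how finite-size fluctuations enter the population equations.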

Author summary

Cortical and hippocampal areas of rodents and monkeys often exhibit neural activity that is best described as sequences of recurring firing-rate patterns, so-called metastable states. Metastable neural population dynamics has been implicated in important sensory and cognitive functions such as neural coding, attention, expectation, and decision-making. An intriguing example is hippocampal replay: short waves of activity across place cells during sleep or rest that represent previous animal trajectories and are thought to be critical for memory consolidation. However, a mechanistic understanding of metastable dynamics in terms of neural circuit parameters such as network size and synaptic properties is largely missing. We derive a simple stochastic population model at the mesoscopic scale from an underlying biological neural network with dynamic synapses at the microscopic scale. This "bottom-up" derivation provides a unique link between emergent population dynamics and neural circuit parameters, thus enabling a systematic analysis of how metastability depends on neuron numbers as well as neuronal and synaptic parameters. Using the mesoscopic model, we discover a novel dynamical regime in which replay events are triggered by fluctuations in finite-size neural networks. This fluctuation-driven regime predicts a high level of variability in the occurrence of replay events that could be tested experimentally.
