Abstract

While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of resonant chaos, characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks.

Author summary

Biological neural networks are formed by a large number of neurons whose interactions can be extremely complex.
Such systems have been successfully studied using random network models, in which the interactions among neurons are assumed to be random. However, the dynamics of single units are usually described using over-simplified models, which might not capture several salient features of real neurons. Here, we show how accounting for richer single-neuron dynamics shapes the network dynamics and determines which signals are better transmitted. We focus on adaptation, an important mechanism present in biological neurons that consists of a decrease in their firing rate in response to a sustained stimulus. Our mean-field approach reveals that the presence of adaptation shifts the network into a previously unreported dynamical regime, which we term resonant chaos, in which chaotic activity has a strong oscillatory component. Moreover, we show that this regime is advantageous for the transmission of low-frequency signals. Our work bridges the microscopic dynamics of single neurons to the macroscopic dynamics of the network, and shows how the global signal-transmission properties of the network can be controlled by acting on the single-neuron dynamics. These results pave the way for further developments that include more complex neural mechanisms, and considerably advance our understanding of realistic neural networks.
