Infoscience
 
doctoral thesis

Taming neuronal noise with large networks

Schmutz, Valentin Marc  
2022

How does reliable computation emerge from networks of noisy neurons? While individual neurons are intrinsically noisy, the collective dynamics of populations of neurons taken as a whole can be almost deterministic, supporting the hypothesis that, in the brain, computation takes place at the level of neuronal populations.

Mathematical models of networks of noisy spiking neurons allow us to study the effects of neuronal noise on the dynamics of large networks. Classical mean-field models, i.e., models where all neurons are identical and where each neuron receives the average spike activity of the other neurons, offer toy examples where neuronal noise is absorbed in large networks, that is, large networks behave like deterministic systems. In particular, the dynamics of these large networks can be described by deterministic neuronal population equations.
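
This noise absorption can be illustrated with a minimal numerical sketch (not from the thesis; the model and parameters are illustrative): each of N neurons spikes independently with a probability set by the previous step's empirical population activity, and the fluctuations of that activity shrink as N grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_neurons, n_steps=1000, coupling=0.5, drive=-1.0):
    """Toy mean-field network: every neuron spikes independently with a
    probability that depends only on the previous population activity a."""
    a = 0.0
    trace = np.empty(n_steps)
    for t in range(n_steps):
        p = 1.0 / (1.0 + np.exp(-(coupling * a + drive)))  # common spike prob.
        spikes = rng.random(n_neurons) < p                 # intrinsic noise
        a = spikes.mean()                                  # population activity
        trace[t] = a
    return trace

# Fluctuations of the population activity shrink like 1/sqrt(N):
for n in (100, 10_000):
    print(n, round(simulate(n)[200:].std(), 4))
```

Each neuron remains maximally noisy (a Bernoulli spike per time bin), yet the population activity settles near the deterministic fixed point of a = sigmoid(coupling * a + drive), as in the classical mean-field picture.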

In this thesis, I first generalize classical mean-field limit proofs to a broad class of spiking neuron models that can exhibit spike-frequency adaptation and short-term synaptic plasticity, in addition to refractoriness. The mean-field limit can be exactly described by a multidimensional partial differential equation, whose long-time behavior can be rigorously studied using deterministic methods.
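
For orientation, in the standard refractoriness-only special case (not the thesis's multidimensional equation), where each neuron's state reduces to the age $a$ since its last spike, the population density $\rho(t,a)$ obeys a nonlocal transport equation of the form

```latex
\partial_t \rho(t,a) + \partial_a \rho(t,a) + \lambda(t,a)\,\rho(t,a) = 0,
\qquad
\rho(t,0) = \int_0^\infty \lambda(t,a)\,\rho(t,a)\,\mathrm{d}a =: A(t),
```

where $\lambda$ is the escape (hazard) rate and $A(t)$ the population activity; adaptation and short-term plasticity add further state dimensions, yielding a multidimensional equation of this type.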

Then, I show that there is a conceptual link between mean-field models for networks of spiking neurons and the latent variable models used in the analysis of multi-neuronal recordings. More specifically, I use a recently proposed finite-size neuronal population equation, which I first mathematically clarify, to design a tractable Expectation-Maximization-type algorithm that can infer the latent population activities of multi-population spiking neural networks from the spike activity of only a few visible neurons. This illustrates the idea that latent variable models can be seen as partially observed mean-field models.
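
The flavor of such an inference problem can be sketched with a generic toy (not the thesis's algorithm; the model, rates, and sizes are hypothetical): a hidden two-state "population activity" drives a few visible Bernoulli neurons, and plain EM (a forward-backward E-step with a closed-form M-step) recovers the latent emission rates from the visible spikes alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model: a hidden two-state population activity drives
# K visible neurons that spike independently given the current state.
T, K = 3000, 5
A = np.array([[0.95, 0.05],
              [0.05, 0.95]])          # latent state-transition matrix (known)
true_rates = np.array([0.05, 0.40])   # per-bin spike probability in each state

states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(2, p=A[states[t - 1]])
counts = (rng.random((T, K)) < true_rates[states][:, None]).sum(axis=1)

def em_rates(counts, A, rates, n_iter=50):
    """EM for the emission rates: scaled forward-backward E-step,
    closed-form Bernoulli M-step; the transition matrix is held fixed."""
    T = len(counts)
    for _ in range(n_iter):
        # likelihood of each bin's spike count under each latent state
        lik = rates ** counts[:, None] * (1 - rates) ** (K - counts[:, None])
        alpha = np.empty((T, 2))
        alpha[0] = 0.5 * lik[0]
        alpha[0] /= alpha[0].sum()
        for t in range(1, T):                 # forward pass
            alpha[t] = lik[t] * (alpha[t - 1] @ A)
            alpha[t] /= alpha[t].sum()
        beta = np.ones((T, 2))
        for t in range(T - 2, -1, -1):        # backward pass
            beta[t] = A @ (lik[t + 1] * beta[t + 1])
            beta[t] /= beta[t].sum()
        post = alpha * beta                   # posterior over latent states
        post /= post.sum(axis=1, keepdims=True)
        # M-step: expected spike fraction while in each state
        rates = (post * counts[:, None]).sum(0) / (K * post.sum(0))
    return rates

est = em_rates(counts, A, np.array([0.2, 0.3]))
print(np.sort(est))   # close to the true rates (0.05, 0.40)
```

The E-step here is what a finite-size population equation would replace in the partially-observed-mean-field view: the latent "population activity" is integrated out rather than observed.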

In classical mean-field models, neurons in large networks behave like independent, identically distributed processes driven by the average population activity, which is deterministic by the law of large numbers. The fact that the neurons are identically distributed implies a form of redundancy that has not been observed in cortex and that seems biologically implausible. To show, numerically, that the redundancy of classical mean-field models is unnecessary for neuronal noise absorption in large networks, I construct a disordered network model in which networks of spiking neurons behave like deterministic rate networks, despite the absence of redundancy.

This last result suggests that the concentration of measure phenomenon, which generalizes the "law of large numbers" of classical mean-field models, might be an instrumental principle for understanding the emergence of noise-robust population dynamics in large networks of noisy neurons.
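
Concentration of measure can be illustrated numerically (a sketch with arbitrary couplings and sizes, not the thesis's model): the population-averaged output of a fixed random, i.e. disordered, network is a Lipschitz function of the N independent noise inputs, so it concentrates around its mean even though no two neurons are interchangeable.

```python
import numpy as np

rng = np.random.default_rng(2)

def averaged_rate_std(n, n_samples=200):
    """Std, over noise realizations, of the population-averaged rate of a
    fixed disordered network with n neurons and Gaussian input noise."""
    J = rng.standard_normal((n, n)) / np.sqrt(n)   # frozen random couplings
    vals = [np.tanh(J @ rng.standard_normal(n)).mean()
            for _ in range(n_samples)]
    return np.std(vals)

# Single neurons stay O(1)-noisy, but the averaged rate concentrates ~ 1/sqrt(n):
for n in (100, 2000):
    print(n, round(averaged_rate_std(n), 4))
```

Unlike the identically-distributed mean-field case, every neuron here has its own frozen random couplings, yet the population average still behaves almost deterministically.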

Type
doctoral thesis
DOI
10.5075/epfl-thesis-9739
Author(s)
Schmutz, Valentin Marc  
Advisors
Gerstner, Wulfram • Löcherbach, Eva
Jury
Prof. Ralf Schneggenburger (president); Prof. Wulfram Gerstner, Prof. Eva Löcherbach (thesis directors); Prof. Juhan Aru, Prof. Nicolas Brunel, Prof. José Carrillo (examiners)

Date Issued
2022
Publisher
EPFL
Publisher place
Lausanne
Public defense date
2022-12-19
Thesis number
9739
Number of pages
179

Subjects
spiking neurons • nonlinear Hawkes processes • mean-field approximations • spike-frequency adaptation • short-term synaptic plasticity • nonlocal transport equation • finite-size fluctuations • latent variable model • disordered systems • concentration of measure

EPFL units
LCN2  
Faculty
SV  
School
BMI  
Doctoral School
EDNE  
Available on Infoscience
December 19, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/193420