doctoral thesis

A theory of memory consolidation and synaptic pruning in cortical circuits

Iatropoulos, Georgios  
2023

Over the course of a lifetime, the human brain acquires an astonishing amount of semantic knowledge and autobiographical memories, often with an imprinting strong enough to allow detailed information to be recalled many years after the initial learning experience took place. The formation of such long-lasting memories is known to primarily involve the cortex, where it is accompanied by a wave of synaptic growth, pruning, and fine-tuning that stretches across several nights of sleep. This process, broadly referred to as consolidation, gradually stabilizes labile information and moves it into permanent storage. It has a profound impact on connectivity and cognitive function, especially during development. Though this process has been extensively studied in terms of behavior and neuroanatomy, it remains unclear how the interplay between structural adaptation and long-term memory consolidation can be explained from a theoretical and computational perspective.

In this thesis, we take a top-down approach to developing a mathematical model of consolidation and pruning within the context of recurrent neural networks, combining recent techniques from the fields of optimization, machine learning, and statistics. The first part of the thesis treats the problem of maximally noise-robust memory without synaptic resource constraints. Using kernel methods, we derive a compact description of networks with optimal weight configurations. This unifies many of the classical memory models under a common mathematical framework and formalizes the relationship between active dendritic processing at the single-neuron level and the storage capacity of the circuit as a whole.
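
The link between maximally noise-robust storage and margin maximization is classical: each neuron of a recurrent network can be trained as a maximum-margin classifier of the other neurons' activities (the perceptron of maximal stability). As a hedged illustration of this idea only, not the thesis code, and using plain linear weights rather than the kernel formulation, the sketch below fits each neuron's incoming weights with scikit-learn's LinearSVC; the network size, pattern count, and parameters are arbitrary assumptions.

    # Minimal sketch (illustrative, not from the thesis): store binary
    # patterns in a recurrent network by giving each neuron max-margin
    # (maximally noise-robust) input weights, one linear SVM per neuron.
    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    N, P = 100, 20                              # neurons, stored patterns
    X = rng.choice([-1.0, 1.0], size=(P, N))    # patterns as +/-1 vectors

    W = np.zeros((N, N))                        # recurrent weights, no self-loops
    for i in range(N):
        inputs = np.delete(X, i, axis=1)        # activities of all other neurons
        svm = LinearSVC(C=1e4, fit_intercept=False, max_iter=100_000)  # ~hard margin
        svm.fit(inputs, X[:, i])                # neuron i must reproduce its own bit
        W[i, np.arange(N) != i] = svm.coef_[0]

    # Every stored pattern should now be a fixed point of the sign dynamics.
    assert np.all(np.sign(X @ W.T) == X)

Maximizing each neuron's margin is what makes recall robust: the larger the margin, the more input noise a stored pattern tolerates before any bit flips.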

In the second part of the thesis, we treat the problem of maximal memory robustness under conditions of sparse connectivity. We combine our unconstrained model with implicit regularization by endowing the network with bi- and tripartite synapses instead of the usual scalar weights. This allows us to derive a simple synaptic learning rule that simultaneously consolidates memories and prunes weights, while incorporating memory replay, multiplicative homeostatic scaling, and weight-dependent plasticity. We also use the synapse model to derive scaling properties of intrinsic synaptic noise, which we test in a meta-analysis of experimental data on dendritic spine dynamics.
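
The consolidate-and-prune rule in the thesis is derived from the multi-factor synapse variables described above; the sketch below is only a loose stand-in built from standard parts: a hinge loss on pattern margins (a crude proxy for replay of under-consolidated memories) followed by an L1 soft-threshold step (a crude proxy for pruning). It shows how a single update can simultaneously strengthen weights that support stored patterns and drive superfluous weights exactly to zero; all parameter values are illustrative assumptions.

    # Illustrative sketch only, not the thesis rule: proximal-gradient
    # updates that jointly consolidate (reinforce weights of unstable
    # patterns) and prune (soft-threshold small weights to exactly zero).
    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 50, 10
    X = rng.choice([-1.0, 1.0], size=(P, N))    # patterns to consolidate
    W = 0.01 * rng.standard_normal((N, N))
    np.fill_diagonal(W, 0.0)

    eta, lam, kappa = 0.05, 0.002, 1.0          # step size, sparsity, target margin
    for _ in range(500):
        H = X @ W.T                             # input to each neuron, per pattern
        margins = X * H                         # > 0 when the pattern bit is reproduced
        replayed = (margins < kappa).astype(float)   # "replay" under-consolidated bits
        W += eta * (replayed * X).T @ X / P     # Hebbian hinge-loss step
        W = np.sign(W) * np.maximum(np.abs(W) - eta * lam, 0.0)  # L1 prune
        np.fill_diagonal(W, 0.0)

    print("fraction of weights pruned:", np.mean(W == 0.0))
    print("all patterns stable:", bool(np.all(np.sign(X @ W.T) == X)))

Here the multiplicative shrinkage step crudely plays the role of homeostatic down-scaling; in the thesis, pruning instead follows from the bi- and tripartite synapse parametrization itself.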

In the concluding sections, we briefly discuss the implications of our results with regard to current memory-inspired machine learning methods, the function of sleep, and environmental effects on structural plasticity during development.

Files
Name: EPFL_TH10094.pdf
Type: N/A
Access type: openaccess
License Condition: copyright
Size: 10.76 MB
Format: Adobe PDF
Checksum (MD5): d970de9bb2be867aa0b5c2b23a7c557d
