Two-factor synaptic consolidation reconciles robust memory with pruning and homeostatic scaling
Memory consolidation involves a process of engram reorganization and stabilization that is thought to occur primarily during sleep through a combination of neural replay, homeostatic plasticity, synaptic maturation, and pruning. From a computational perspective, however, this process remains puzzling, as it is unclear how the underlying mechanisms can be incorporated into a common mathematical model of learning and memory. Here, we propose a solution by deriving a consolidation model that uses replay and two-factor synapses to store memories in recurrent neural networks with sparse connectivity and maximal noise robustness. The model offers a unified account of experimental observations of consolidation, such as multiplicative homeostatic scaling, task-driven synaptic pruning, increased neural stimulus selectivity, and preferential strengthening of weak memories. The model further predicts that intrinsic synaptic noise scales sublinearly with synaptic strength; this is supported by a meta-analysis of published synaptic imaging datasets.
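The mechanisms named in the abstract can be illustrated with a toy sketch. The code below is NOT the paper's actual model or derivation; it is a hypothetical Hopfield-style illustration in which each synapse carries two factors (a fast encoding weight and a slow consolidation variable), replay of stored patterns drives the slow factor, multiplicative homeostatic scaling keeps total synaptic strength fixed, and the weakest synapses are pruned. All network sizes, rates, and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                        # neurons, stored patterns (hypothetical sizes)
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Fast factor: Hebbian outer-product weights from initial encoding
w_fast = patterns.T @ patterns / N
np.fill_diagonal(w_fast, 0.0)

# Slow factor: consolidation variable built up by replay during "sleep"
w_slow = np.zeros_like(w_fast)

for epoch in range(20):              # replay epochs
    for xi in patterns:
        # Replay a noisy version of each stored pattern
        x = np.where(rng.random(N) < 0.1, -xi, xi)
        # Hebbian update of the slow factor driven by replayed activity
        w_slow += 0.01 * np.outer(x, x) / N
    np.fill_diagonal(w_slow, 0.0)
    # Multiplicative homeostatic scaling: rescale the slow factor so its
    # total strength matches that of the fast factor
    total = np.abs(w_slow).sum()
    if total > 0:
        w_slow *= np.abs(w_fast).sum() / total

w = w_fast + w_slow

# Task-driven pruning (caricature): remove the weakest 50% of synapses
thresh = np.quantile(np.abs(w), 0.5)
w_pruned = np.where(np.abs(w) >= thresh, w, 0.0)

# Recall test: a noisy cue of pattern 0, one synchronous update step
cue = np.where(rng.random(N) < 0.15, -patterns[0], patterns[0])
recalled = np.sign(w_pruned @ cue)
overlap = float(recalled @ patterns[0]) / N   # ~1.0 means successful recall
```

Even after half the synapses are pruned, the consolidated network recovers the stored pattern from a corrupted cue, which is the kind of noise robustness under sparsification the abstract refers to.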
2024.07.23.604787v1.full.pdf