Two-factor synaptic consolidation reconciles robustness with pruning and homeostatic scaling
Memory consolidation refers to a process of engram reorganization and stabilization that is thought to occur primarily during sleep through a combination of neural replay, homeostatic plasticity, synaptic maturation, and pruning. From a computational perspective, however, this process remains puzzling, as it is unclear how to incorporate the underlying mechanisms into a common mathematical model of learning and memory. Here, we propose a solution by deriving a self-supervised consolidation model that uses replay and two-factor synapses to encode memories in neural networks in a way that maximizes the robustness of cued recall with respect to intrinsic synaptic noise. We show that the dynamics of this optimization make the connectivity sparse and offer a unified account of several experimentally observed signs of consolidation, such as multiplicative homeostatic scaling, task-driven synaptic pruning, increased neural stimulus selectivity, and preferential strengthening of weak memories. The model also reproduces developmental trends in connectivity and stimulus selectivity better than previous models. Finally, it predicts that intrinsic synaptic noise fluctuations should scale sublinearly with synaptic strength; we find support for this in a meta-analysis of published synaptic imaging datasets.
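The abstract does not spell out the model's equations, but its main ingredients can be illustrated in toy form. The following is a minimal sketch under stated assumptions, not the paper's implementation: it assumes a Hopfield-style associative network, a product parameterization w = a·b as the "two factors", an error-driven replay update, threshold pruning, multiplicative rescaling of each neuron's total input as the homeostatic mechanism, and intrinsic noise with amplitude proportional to |w|^α with α < 1 (the sublinear scaling the abstract predicts). All names and parameters (N, P, alpha, prune_thresh) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 10                          # neurons, stored patterns (toy sizes)
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)

# Two-factor parameterisation (an assumption of this sketch): each weight is
# the product w = a * b of two plastic factors, seeded from a Hebbian rule.
W0 = patterns.T @ patterns / N
a = np.sqrt(np.abs(W0))
b = np.sign(W0) * np.sqrt(np.abs(W0))
np.fill_diagonal(a, 0.0)
np.fill_diagonal(b, 0.0)

def recall(a, b, cue, noise_scale=0.1, alpha=0.5, steps=20):
    """Cued recall under intrinsic synaptic noise whose amplitude grows as
    |w|**alpha; alpha < 1 is the sublinear scaling the abstract predicts."""
    W = a * b
    s = cue.copy()
    for _ in range(steps):
        noise = noise_scale * np.abs(W) ** alpha * rng.standard_normal(W.shape)
        s = np.sign((W + noise) @ s)
        s[s == 0] = 1.0
    return s

def consolidate(a, b, patterns, lr=0.05, prune_thresh=1e-3, epochs=5):
    """Replay-driven consolidation: replay each pattern, nudge both factors
    to reduce the noisy-recall error, prune vanishing synapses, and rescale
    each neuron's total input multiplicatively (homeostatic scaling)."""
    target = np.abs(a * b).sum(axis=1, keepdims=True)   # initial input budget
    for _ in range(epochs):
        for p in patterns:
            err = np.outer(p - recall(a, b, p), p) / len(p)
            a, b = a + lr * err * b, b + lr * err * a   # product (chain) rule
        dead = np.abs(a * b) < prune_thresh             # task-driven pruning:
        a[dead] = 0.0                                   # once both factors are
        b[dead] = 0.0                                   # zero, updates vanish
        total = np.abs(a * b).sum(axis=1, keepdims=True)
        gain = np.sqrt(target / np.maximum(total, 1e-12))
        a *= gain                                       # multiplicative
        b *= gain                                       # homeostatic scaling
    return a, b

a, b = consolidate(a, b, patterns)
cue = patterns[0].copy()
cue[: N // 10] *= -1                                    # corrupt 10% of cue
overlap = patterns[0] @ recall(a, b, cue) / N
print(f"recall overlap: {overlap:.2f}, sparsity: {np.mean(a * b == 0):.2f}")
```

One reason the product form is a natural stand-in for "two-factor" synapses: because each factor's update is gated by the other (err * b and err * a), a synapse whose two factors both reach zero can never be revived by further replay, so pruning falls out of the dynamics rather than being imposed separately.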