Structure and Computation in Disordered Systems and Neural Networks
This thesis investigates structure and computation in high-dimensional complex systems, with a focus on dynamical processes on graphs and learning in neural networks. Both settings involve many interacting variables coupled through disorder, which gives rise to rich emergent phenomena. A central objective is to understand how microscopic interactions produce macroscopic behavior, and how phase transitions delineate qualitatively distinct regimes.

In the first part, we extend methods from statistical physics to study dynamical processes on graphs. We develop the backtracking dynamical cavity method, which characterizes attractors by tracing the dynamics backward from their final states. This approach yields new insights into quenches in spin glasses and into dynamical phase transitions in cellular automata and opinion dynamics on random graphs.

In the second part, we turn to neural networks and examine how learning dynamics shape internal representations and mechanisms. We focus on sequence models and analyse them using solvable toy models, controlled experiments, and interpretability tools. We identify a phase transition between positional and semantic learning in attention models, characterize distinct counting strategies in small transformers, and show that soft labels can reveal information about held-out samples in dataset distillation.

Overall, this thesis contributes new analytical methods for graph dynamics and theoretical insights into the inner workings of neural networks.