Title: Sparse autoregressive neural networks for classical spin systems
Authors: Biazzo, Indaco; Wu, Dian; Carleo, Giuseppe
Dates: 2024-07-03; 2024-06-01
DOI: 10.1088/2632-2153/ad5783
Repository: https://infoscience.epfl.ch/handle/20.500.14299/209145
Web of Science: WOS:001251404900001
Document type: Journal article (research article)
Keywords: Technology; Sparse; Spin; Autoregressive Neural Network; Neural Network; Statistical Physics; Spin Glass; Complex System

Abstract: Efficient sampling and approximation of Boltzmann distributions involving large sets of binary variables, or spins, are pivotal in diverse scientific fields, even beyond physics. Recent advances in generative neural networks have significantly impacted this domain. However, these neural networks are often treated as black boxes, with architectures primarily influenced by data-driven problems in computational science. Addressing this gap, we introduce a novel autoregressive neural network architecture named TwoBo, specifically designed for sparse two-body interacting spin systems. We directly incorporate the Boltzmann distribution into its architecture and parameters, resulting in enhanced convergence speed, superior free energy accuracy, and fewer trainable parameters. We perform numerical experiments on disordered, frustrated systems with more than 1000 spins on grids and random graphs, and demonstrate its advantages over previous autoregressive and recurrent architectures. Our findings validate a physically informed approach and suggest potential extensions to multivalued variables and many-body interaction systems, paving the way for broader applications in scientific research.
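The abstract refers to approximating Boltzmann distributions of two-body spin systems with an autoregressive factorization q(s) = prod_i q(s_i | s_1, ..., s_{i-1}). The sketch below is not the paper's TwoBo architecture; it is a minimal, generic illustration of how such a factorization yields exact samples and a variational free-energy estimate. The logistic parametrization, the random couplings, and all variable names are illustrative assumptions.

```python
# Minimal sketch (not TwoBo): a generic autoregressive model over binary spins,
# q(s) = prod_i q(s_i | s_<i), used to estimate the variational free energy
# F_q = <E(s) + T * log q(s)>_q, an upper bound on the true free energy.
import numpy as np

rng = np.random.default_rng(0)

N = 16            # number of spins (illustrative)
T = 1.0           # temperature
J = np.triu(rng.normal(size=(N, N)), k=1)   # illustrative two-body couplings, i < j

def energy(s):
    """Two-body energy E(s) = -sum_{i<j} J_ij s_i s_j for s in {-1, +1}^N."""
    return -s @ J @ s

# Autoregressive parameters: spin i is conditioned only on spins 0..i-1.
W = 0.01 * rng.normal(size=(N, N))          # W[i, :i] is used for spin i
b = np.zeros(N)

def sample_with_logq(n_samples):
    """Draw configurations spin by spin and accumulate log q(s)."""
    s = np.zeros((n_samples, N))
    logq = np.zeros(n_samples)
    for i in range(N):
        h = s[:, :i] @ W[i, :i] + b[i]      # conditional field from earlier spins
        p_up = 1.0 / (1.0 + np.exp(-h))     # q(s_i = +1 | s_<i)
        up = rng.random(n_samples) < p_up
        s[:, i] = np.where(up, 1.0, -1.0)
        logq += np.where(up, np.log(p_up + 1e-12), np.log(1.0 - p_up + 1e-12))
    return s, logq

samples, logq = sample_with_logq(4096)
E = np.array([energy(x) for x in samples])
F_estimate = np.mean(E + T * logq)          # variational free-energy estimate
print(f"Variational free energy estimate: {F_estimate:.3f}")
```

In a variational training loop, the parameters would be updated to minimize F_q (e.g., with a REINFORCE-style gradient); the paper's contribution is to build the sparse two-body structure of the Boltzmann distribution directly into the architecture rather than using a generic parametrization like the one above.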