Backpropagation-free training of deep physical neural networks

Momeni, Ali; Rahmani, Babak; Malléjac, Matthieu; del Hougne, Philipp; Fleury, Romain

Research article (journal article)
Published: 2023-12-15
DOI: 10.1126/science.adi8474
Handle: https://infoscience.epfl.ch/handle/20.500.14299/202678

Abstract: Recent successes in deep learning for vision and natural language processing are attributed to larger models, but these come with energy-consumption and scalability issues. Current training of digital deep-learning models relies primarily on backpropagation, which is unsuitable for physical implementation. In this work, we propose a simple deep neural network architecture augmented by a physical local learning (PhyLL) algorithm, which enables supervised and unsupervised training of deep physical neural networks without detailed knowledge of the nonlinear physical layer's properties. We trained diverse wave-based physical neural networks in vowel- and image-classification experiments, showcasing the universality of our approach. Our method offers advantages over other hardware-aware training schemes: it improves training speed, enhances robustness, and reduces power consumption by eliminating the need for system modeling, thereby decreasing digital computation.
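The abstract does not spell out the PhyLL update rule, but the core idea it describes — training each layer with a purely local objective while treating the nonlinear physical transform as a black box whose outputs are only observed, never differentiated — can be sketched. Below is a minimal, illustrative Python sketch assuming a Forward-Forward-style layer-local "goodness" objective and trainable digital weights applied to the observed physical outputs. The simulated `physical_layer`, the loss, the toy data, and all names are hypothetical stand-ins for illustration, not the paper's actual method or experiments.

```python
# Illustrative sketch only: layer-local, backpropagation-free training.
# The "physical" transform is a frozen black box; updates to the digital
# weights W use only its observed outputs, never its internals or Jacobian.
import numpy as np

rng = np.random.default_rng(0)

def physical_layer(x, A):
    """Stand-in for a fixed, unknown nonlinear physical transform."""
    return np.tanh(x @ A)

def goodness(h):
    """Forward-Forward-style 'goodness': mean squared activation per sample."""
    return np.mean(h * h, axis=1)

def train_layer(W, x_pos, x_neg, A, theta=1.0, lr=0.05, steps=200):
    """Train one layer's digital weights with a purely local objective.

    The trainable map acts on the *observed* physical outputs phi, so the
    exact gradient w.r.t. W needs no model of the physical transform.
    """
    phi_pos = physical_layer(x_pos, A)   # black-box pass, positive data
    phi_neg = physical_layer(x_neg, A)   # black-box pass, negative data
    n, d = len(x_pos), W.shape[1]
    for _ in range(steps):
        h_pos, h_neg = phi_pos @ W, phi_neg @ W
        g_pos, g_neg = goodness(h_pos), goodness(h_neg)
        # Logistic losses push positive goodness above theta, negative below:
        # L = softplus(theta - g_pos) + softplus(g_neg - theta).
        s_pos = -1.0 / (1.0 + np.exp(g_pos - theta))   # dL/dg, positives
        s_neg = 1.0 / (1.0 + np.exp(theta - g_neg))    # dL/dg, negatives
        grad = (phi_pos.T @ (h_pos * s_pos[:, None])
                + phi_neg.T @ (h_neg * s_neg[:, None])) * (2.0 / (d * n))
        W -= lr * grad
    return W

# Toy data: positives carry a fixed per-coordinate pattern; negatives are
# the same rows with coordinates shuffled, which destroys the pattern.
pattern = np.linspace(-1.0, 1.0, 16)
x_pos = rng.normal(size=(128, 16)) + pattern
x_neg = rng.permuted(x_pos, axis=1)

for layer in range(3):                   # layers are trained greedily, in turn
    A = rng.normal(size=(x_pos.shape[1], 32)) / 4.0       # frozen "physics"
    W = rng.normal(size=(32, 32)) / np.sqrt(32.0)
    W = train_layer(W, x_pos, x_neg, A)
    h_pos = physical_layer(x_pos, A) @ W
    h_neg = physical_layer(x_neg, A) @ W
    print(f"layer {layer}: goodness pos={goodness(h_pos).mean():.3f}, "
          f"neg={goodness(h_neg).mean():.3f}")
    # Normalize before the next layer (as in Forward-Forward) so later
    # layers cannot solve the task from raw output magnitude alone.
    x_pos = h_pos / (np.linalg.norm(h_pos, axis=1, keepdims=True) + 1e-8)
    x_neg = h_neg / (np.linalg.norm(h_neg, axis=1, keepdims=True) + 1e-8)
```

Note the design point the abstract emphasizes: because each layer's loss is local and the trainable operation sits after the observed physical output, no digital model of the physical system is ever built or differentiated, which is what removes the system-modeling and backpropagation overhead.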