doctoral thesis

Physical Neural Networks with Waves

Momeni, Ali  
2025

As Moore's Law approaches its physical limits and conventional scaling trends stagnate, the development of alternative computing paradigms has become increasingly imperative. Physical neural networks (PNNs) leverage analog physical systems to perform neural-like computations and offer a promising pathway toward low-power artificial intelligence (AI). PNNs, such as those based on optical platforms, offer compelling advantages in power efficiency and scalability over conventional electronic hardware. Although currently confined to small-scale laboratory demonstrations, PNNs have the potential to dramatically expand the capabilities of AI, enabling significantly larger models and facilitating energy-efficient, local, and private inference on edge devices. Achieving this transformative potential requires rethinking AI models and their training strategies within the constraints imposed by the underlying hardware physics.

This thesis begins by exploring the potential of wave-based systems for performing spatially parallel linear operations, leveraging principles such as Green's function engineering through carefully designed metasurfaces and metagratings to implement analog computing primitives. Building on these capabilities, it demonstrates how wave-based platforms, such as time-Floquet modulated media and nonlinear acoustic metamaterials, can realize neural architectures like Extreme Learning Machines (ELMs) and Reservoir Computing (RC) for in-sensor processing. To further improve the energy efficiency and scalability of PNNs, the concept of structural nonlinearity is proposed: input data are encoded directly into tunable physical parameters of otherwise linear systems, enabling fully nonlinear neuromorphic computing with linear media, eliminating the need for discrete nonlinear components, and facilitating scalable analog learning.

To address the challenge of training PNNs, we first explore a wide range of training methods adapted to analog systems. Particular attention is given to the development of an online training framework based on locality-aware surrogate models, which leverages a novel loss function, GradPIE, in black-box settings. Next, the Physical Local Learning (PhyLL) scheme is proposed, enabling efficient supervised and unsupervised training of deep physical neural networks without requiring detailed characterization of the nonlinear physical substrate. The universality of the proposed method is demonstrated using three distinct wave-based systems, each characterized by its own underlying wave phenomena and nonlinearities: the first features a chaotic acoustic cavity implemented with nonlinear scatterers, the second a chaotic microwave cavity, and the third a modeled optical multimode fiber with readout nonlinearity.

In conclusion, this thesis introduces foundational architectures and training methodologies that advance the field of analog AI computing. The proposed PhyLL framework enables robust, hardware-native learning that is resilient to perturbations and physical imperfections, while the concept of structural nonlinearity facilitates fully nonlinear processing through reconfigurable linear media. Collectively, these contributions lay the foundation for scalable, energy-efficient, and adaptive physical neural networks, paving the way toward future fully analog AI systems capable of real-time, private, and low-power computation.
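
As a concrete illustration of the wave-based analog computing primitives mentioned above, the following toy sketch (not taken from the thesis; the Gaussian test field and all numerical parameters are illustrative assumptions) shows how a hypothetical metasurface with an engineered angular transfer function H(k) = -k² would perform spatial second-order differentiation of an incident field profile:

    import numpy as np

    # Hypothetical metasurface whose angular transfer function is H(k) = -k^2:
    # in the spatial-frequency domain this is exactly a second derivative.
    x = np.linspace(-10, 10, 512)
    f = np.exp(-x**2)                                   # incident field profile

    k = 2 * np.pi * np.fft.fftfreq(x.size, d=x[1] - x[0])
    H = -k**2                                           # engineered transfer function
    d2f = np.fft.ifft(H * np.fft.fft(f)).real           # field behind the surface

    d2f_exact = (4 * x**2 - 2) * np.exp(-x**2)          # analytic second derivative
    print(np.max(np.abs(d2f - d2f_exact)))              # small numerical error

The printed error should be small, since multiplying by -k² in the Fourier domain is exactly the second-derivative operator for a band-limited field; transfer-function engineering of this kind is the simplest instance of the Green's-function approach to analog linear operators.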
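
Similarly, the idea of structural nonlinearity can be sketched numerically. The snippet below is a minimal, hypothetical example (not the thesis's actual implementation; the random-medium model, the phase-encoding map, and the toy task are all assumptions): data are encoded into tunable phase parameters of a strictly linear random scattering medium, and only a linear readout is trained, in the spirit of an Extreme Learning Machine.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_modes = 8, 64                               # toy sizes, chosen arbitrarily

    # Fixed random linear medium (stand-in for a metasurface or chaotic cavity):
    # propagation is a single complex matrix, i.e. strictly linear in the field.
    T = (rng.standard_normal((n_modes, n_modes))
         + 1j * rng.standard_normal((n_modes, n_modes))) / np.sqrt(2 * n_modes)

    # Random map from the data to tunable physical parameters (e.g. phase shifters).
    P = rng.standard_normal((n_in, n_modes))

    def features(X):
        """Structural nonlinearity: the data set the *parameters* of the linear
        medium (a data-dependent phase mask), not the propagating field itself,
        so the data-to-feature map is nonlinear while the physics stays linear."""
        probe = np.ones(n_modes) / np.sqrt(n_modes)     # fixed probe wave
        phases = np.exp(1j * np.pi * np.tanh(X @ P))    # data-dependent phases
        out = (phases * probe) @ T.T                    # linear propagation
        return np.abs(out) ** 2                         # intensity detection

    # Extreme-learning-machine-style training: only a linear readout is fitted.
    X = rng.standard_normal((400, n_in))
    y = (X[:, 0] * X[:, 1] > 0).astype(float)           # nonlinearly separable toy task
    w, *_ = np.linalg.lstsq(features(X), y, rcond=None)
    print("readout training accuracy:", np.mean(((features(X) @ w) > 0.5) == y))

Even though the medium applies only a fixed linear matrix to the field, the intensity features are nonlinear in the data, so the linear readout can fit a task that is not linearly separable in the raw inputs.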

Type
doctoral thesis
DOI
10.5075/epfl-thesis-10877
Author(s)
Momeni, Ali  

École Polytechnique Fédérale de Lausanne

Advisors
Fleury, Romain Christophe Rémy  
Jury

Prof. Mahsa Shoaran (president); Prof. Romain Christophe Rémy Fleury (thesis director); Prof. Christophe Moser, Prof. Nader Engheta, Prof. Sylvain Gigan (examiners)

Date Issued

2025

Publisher

EPFL

Publisher place

Lausanne

Public defense date

2025-10-31

Thesis number

10877

Total of pages

198

Subjects

Physical neural networks • optical neural networks • unconventional computing • neuromorphic computing • machine learning • deep learning • learning algorithms

EPFL units
LWE  
Faculty
STI  
School
IEL  
Doctoral School
EDEE  
Available on Infoscience
October 20, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/255120