Infoscience
research article

Residual-based attention in physics-informed neural networks

Anagnostopoulos, Sokratis J.
•
Toscano, Juan Diego
•
Stergiopulos, Nikolaos
•
Karniadakis, George Em
March 1, 2024
Computer Methods In Applied Mechanics And Engineering

Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years. However, ensuring the reliability of their convergence and accuracy remains a challenge. In this work, we propose an efficient, gradient-less weighting scheme for PINNs that accelerates the convergence of dynamic or static systems. This simple yet effective attention mechanism is a bounded function of the evolving cumulative residuals and aims to make the optimizer aware of problematic regions at no extra computational cost or adversarial learning. We illustrate that this general method consistently achieves one order of magnitude faster convergence than vanilla PINNs and a minimum relative L2 error of O(10^-5) on typical benchmarks from the literature. The method is further tested on the inverse solution of the Navier-Stokes equations within the brain perivascular spaces, where it considerably improves the prediction accuracy. Furthermore, an ablation study is performed for each case to identify the contribution of the components that enhance the vanilla PINN formulation. Evident from the convergence trajectories is the ability of the optimizer to effectively escape poor local minima or saddle points while focusing on the challenging domain regions, which consistently have a high residual score. We believe that alongside exact boundary conditions and other model reparameterizations, this type of attention mask could be an essential element for fast training of both PINNs and neural operators.
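The bounded, gradient-less weighting described in the abstract can be sketched as a per-collocation-point multiplier that decays each step and is bumped by the point's normalized residual. This is a minimal illustrative sketch, not the authors' exact implementation: the update rule, the decay factor `gamma`, and the step size `eta` are assumptions chosen so the weights stay bounded by `eta / (1 - gamma)`.

```python
import numpy as np

def update_rba_weights(weights, residuals, gamma=0.999, eta=0.01):
    """Gradient-free residual-based attention update (illustrative sketch).

    Each point's weight decays by gamma and grows in proportion to its
    absolute residual normalized by the current maximum, so points with
    persistently large residuals accumulate higher attention. The weights
    are bounded above by eta / (1 - gamma).
    """
    r = np.abs(residuals)
    return gamma * weights + eta * r / (r.max() + 1e-12)

# Toy demo: one point keeps a large residual, the rest stay small.
weights = np.ones(5)
residuals = np.array([1.0, 0.1, 0.1, 0.1, 0.1])
for _ in range(200):
    weights = update_rba_weights(weights, residuals)

# The weighted residual loss the optimizer would then minimize:
loss = np.mean((weights * residuals) ** 2)
print(weights)  # the high-residual point accumulates the largest weight
```

Because the update involves no gradients of the weights themselves, it adds essentially no computational cost per training step, which matches the abstract's claim of a gradient-less scheme.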

Type
research article
DOI
10.1016/j.cma.2024.116805
Web of Science ID
WOS:001171429600001
Author(s)
Anagnostopoulos, Sokratis J.
•
Toscano, Juan Diego
•
Stergiopulos, Nikolaos  
•
Karniadakis, George Em
Date Issued
2024-03-01
Publisher
Elsevier Science Sa
Published in
Computer Methods In Applied Mechanics And Engineering
Volume
421
Article Number
116805
Subjects
Technology • Physical Sciences • Residual-Based Attention • PINNs Accuracy • Adaptive Weights • Fast Convergence
Peer reviewed
REVIEWED

Written at
EPFL
EPFL units
LHTC
Funder (Grant Number)
Swiss National Science Foundation grant "Hemodynamics of Physiological Aging" (205321_197234)
DOE SEA-CROGS project (DE-SC0023191)
ONR Vannevar Bush Faculty Fellowship (FA9550-20-1-0358)
Available on Infoscience
March 18, 2024
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/206554
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.