Infoscience
research article

A prediction rigidity formalism for low-cost uncertainties in trained neural networks

Bigi, Filippo • Chong, Sanggyu • Ceriotti, Michele • Grasselli, Federico
December 1, 2024
Machine Learning: Science and Technology

Quantifying the uncertainty of regression models is essential to ensure their reliability, particularly since their application often extends beyond their training domain. Based on the solution of a constrained optimization problem, this work proposes ‘prediction rigidities’ as a formalism to obtain uncertainties of arbitrary pre-trained regressors. A clear connection between the suggested framework and Bayesian inference is established, and a last-layer approximation is developed and rigorously justified to enable the application of the method to neural networks. This extension affords cheap uncertainties without any modification to the neural network itself or its training procedure. The effectiveness of this approach is shown for a wide range of regression tasks, ranging from simple toy models to applications in chemistry and meteorology.
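In spirit, the last-layer approximation described in the abstract amounts to post-hoc covariance propagation through the final linear layer of an already-trained network. The sketch below is only an illustration of that general idea, not the paper's exact formalism: it builds a regularized Gram matrix from penultimate-layer activations over the training set and propagates it to a per-prediction variance. The function names, array shapes, and the regularizer `alpha` are hypothetical.

```python
import numpy as np

# Rough sketch of a last-layer uncertainty estimate for a pre-trained regressor.
# Idea: treat the features feeding the final linear layer as fixed, build a
# regularized Gram matrix of those features over the training set, and propagate
# its inverse to a per-prediction variance for new inputs.

def last_layer_covariance(train_features: np.ndarray, alpha: float = 1e-3) -> np.ndarray:
    """Inverse of the regularized Gram matrix of last-layer training features.

    train_features: (n_train, n_features) penultimate-layer activations.
    alpha: illustrative regularization constant (assumption, not from the paper).
    """
    n_features = train_features.shape[1]
    gram = train_features.T @ train_features + alpha * np.eye(n_features)
    return np.linalg.inv(gram)

def prediction_variance(test_features: np.ndarray, cov: np.ndarray) -> np.ndarray:
    """Per-sample quadratic form f^T C f for each row f of test_features."""
    return np.einsum("ij,jk,ik->i", test_features, cov, test_features)

# Usage with synthetic shapes: variances grow for inputs whose last-layer
# features lie far from the span of the training features.
rng = np.random.default_rng(0)
phi_train = rng.normal(size=(500, 16))   # training-set activations
phi_test = rng.normal(size=(10, 16))     # activations for new inputs
cov = last_layer_covariance(phi_train)
var = prediction_variance(phi_test, cov)
print(var.shape)  # (10,)
```

The appeal of this kind of construction, as the abstract notes, is that it requires no change to the network architecture or its training procedure: everything is computed from quantities available after training.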

Type
research article
DOI
10.1088/2632-2153/ad805f
Scopus ID
2-s2.0-85207662943

Author(s)
Bigi, Filippo (École Polytechnique Fédérale de Lausanne)
Chong, Sanggyu (École Polytechnique Fédérale de Lausanne)
Ceriotti, Michele (École Polytechnique Fédérale de Lausanne)
Grasselli, Federico (École Polytechnique Fédérale de Lausanne)

Date Issued
2024-12-01
Published in
Machine Learning: Science and Technology
Volume
5
Issue
4
Article Number
045018

Subjects
low-cost uncertainties • neural network • pre-trained • predictions • regression • rigidity • uncertainty quantification

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
COSMO
Funding(s)
NCCR MARVEL
European Research Council
Swiss National Science Foundation (grant no. 182892)
Available on Infoscience
January 25, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/244217