Infoscience

EPFL, École polytechnique fédérale de Lausanne
Doctoral thesis

Advancing understanding and practical performance of machine learning interatomic potentials

Pozdnyakov, Sergey  
2025

Atomistic machine learning has revolutionized atomistic modeling. Traditionally, molecular dynamics simulations relied on classical force fields or electronic-structure calculations such as Density Functional Theory. Classical force fields are fast but less accurate; electronic-structure calculations are accurate but expensive. Machine learning interatomic potentials (MLIPs) combine the best of both, preserving the accuracy of electronic-structure methods while offering linear scaling akin to classical force fields.

A crucial property of any machine learning methodology is completeness: the ability to approximate any reasonable function arbitrarily closely by increasing the model size. We begin by showing that whole families of MLIPs are incomplete. The first, and most prevalent, consists of methods relying on three-body correlations (e.g., Gaussian Approximation Potentials and Behler-Parrinello Neural Networks). The second comprises distance-based Graph Neural Networks such as SchNet and Crystal Graph Convolutional Neural Networks.

We present several methods to address this issue. While using k-body correlations with large k can achieve completeness, doing so is computationally expensive. We propose a refined iterative procedure that efficiently computes only the most relevant linear combinations of such descriptors. We also discuss an alternative approach that achieves efficiency by directly computing kernels between atomic environments. Finally, we show that completeness can be achieved at finite body order by employing representations associated with triplets of atoms.

The incompleteness issue is directly related to preserving symmetries in MLIPs: without rotational-invariance constraints, constructing an efficient and systematically improvable methodology is trivial. We introduce a general method that symmetrizes any machine learning model a posteriori, allowing non-invariant architectures to be used in materials modeling.
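The kind of degeneracy that makes purely distance-based descriptors incomplete can be illustrated with a toy example. The sketch below is not from the thesis (whose counterexamples involve 3D atomic environments); it uses a classic 1D "homometric" pair of point sets that are not related by translation or reflection yet share the same multiset of pairwise distances, so any model built only on that multiset cannot distinguish them.

```python
from collections import Counter
from itertools import combinations

# Classic homometric pair: two non-congruent 1D point sets that share
# the same multiset of pairwise distances -- a toy analogue of the
# degeneracies affecting distance-based descriptors.
A = [0, 1, 4, 10, 12, 17]
B = [0, 1, 8, 11, 13, 17]

def distance_multiset(points):
    """Multiset of all pairwise distances of a 1D point set."""
    return Counter(abs(p - q) for p, q in combinations(points, 2))

# Identical distance information, yet B is neither a translation
# nor a reflection of A.
print(distance_multiset(A) == distance_multiset(B))  # True
print(sorted(17 - b for b in B) == A)                # False (not a mirror image)
```

Any descriptor that is a function of the distance multiset alone assigns these two structures identical features, so no amount of model capacity can separate them.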
This method not only solves the incompleteness problem but also lifts the symmetry barrier, enabling 3D point-cloud models from other domains to be adapted to atomistic simulations. With the Point Edge Transformer (PET), we demonstrate that non-equivariant models fitted with rotational augmentations can outperform rigorously equivariant ones. The model has several desirable properties that are difficult to achieve within an equivariant framework and reaches state-of-the-art performance on multiple atomistic machine learning benchmarks.

Finally, we explore using the unsymmetrized PET in molecular dynamics to avoid the computational overhead of symmetrization. Although not strictly rotationally invariant, the unsymmetrized PET closely approximates this property during training: the resulting rotational discrepancies are small compared to prediction errors and therefore negligible in practice. Our analysis confirms that MD simulations with the unsymmetrized PET show no significant differences in dynamic observables compared to symmetrized versions, highlighting the practicality and efficiency of unconstrained models like PET in atomistic simulations.

In summary, this thesis presents theoretical advances that deepen our understanding of MLIPs and introduces practically viable models. The efficiency, stability, and reliability of PET in molecular dynamics highlight the promise of unconstrained models, suggesting a new avenue of research in atomistic machine learning.
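The two ideas above, a-posteriori symmetrization and measuring how far a non-invariant model is from rotational invariance, can be sketched in a few lines of NumPy. This is a minimal illustration under assumed names (`model`, `symmetrize`, `rotational_discrepancy` are illustrative), not the thesis's implementation:

```python
import numpy as np

def random_rotation(rng):
    """Uniform random 3x3 rotation matrix via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))      # fix column signs (unique factorization)
    if np.linalg.det(q) < 0:      # enforce det = +1: a proper rotation
        q[:, 0] *= -1
    return q

def symmetrize(model, positions, n_rot=128, seed=0):
    """A-posteriori symmetrization: average a (possibly non-invariant)
    scalar model over sampled rotations of the input structure."""
    rng = np.random.default_rng(seed)
    return np.mean([model(positions @ random_rotation(rng).T)
                    for _ in range(n_rot)])

def rotational_discrepancy(model, positions, n_rot=128, seed=0):
    """Spread of predictions across random orientations; exactly zero
    for a strictly invariant model."""
    rng = np.random.default_rng(seed)
    return np.std([model(positions @ random_rotation(rng).T)
                   for _ in range(n_rot)])

# Toy invariant "energy": sum of all pairwise distances.
def invariant_energy(p):
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    return d.sum()
```

For an invariant model the discrepancy vanishes (up to floating-point error), and symmetrization leaves its predictions unchanged; for a non-invariant model, averaging over rotations restores approximate invariance at the cost of extra evaluations, which is the overhead the unsymmetrized-PET experiments avoid.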

Files
Name: EPFL_TH10703.pdf
Type: Main Document
Version: http://purl.org/coar/version/c_be7fb7dd8ff6fe43
Access type: Open access
License Condition: N/A
Size: 5.27 MB
Format: Adobe PDF
Checksum (MD5): b803b62c141aaccf103a84812105d0e1

  • Contact: infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.