Infoscience
conference paper

Universality laws for Gaussian mixtures in generalized linear models

Dandi, Yatin • Stephan, Ludovic • Krzakala, Florent • Loureiro, Bruno • Zdeborová, Lenka

December 11, 2024
NeurIPS Proceedings
Advances in Neural Information Processing Systems 36 (NeurIPS 2023)

A recent line of work in high-dimensional statistics, working under the Gaussian mixture hypothesis, has led to a number of results in the context of empirical risk minimization, Bayesian uncertainty quantification, separation of kernel methods and neural networks, ensembling, and fluctuations of random features. We provide rigorous proofs for the applicability of these results to a general class of datasets $(x_i, y_i)_{i=1,\dots,n}$ containing independent samples from a mixture distribution $\sum_{c \in \mathcal{C}} \rho_c P_c^x$. Specifically, we consider the hypothesis class of generalized linear models $\hat{y} = F(\Theta^\top x)$ and investigate the asymptotic joint statistics of a family of generalized linear estimators $(\Theta^{(1)}, \dots, \Theta^{(M)})$, obtained either from (a) minimizing an empirical risk $\hat{R}_n^{(m)}(\Theta^{(m)}; X, y)$ or (b) sampling from the associated Gibbs measure $\exp(-\beta n \hat{R}_n^{(m)}(\Theta^{(m)}; X, y))$. Our main contribution is to characterize under which conditions the asymptotic joint statistics of this family depend (in a weak sense) only on the means and covariances of the class-conditional feature distributions $P_c^x$. This allows us to prove the universality of different quantities of interest, including training and generalization errors, as well as the geometrical properties and correlations of the estimators.
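
To make the universality statement concrete, here is a minimal numerical sketch, not the paper's construction: a generalized linear estimator (ridge-regularized logistic regression standing in for the empirical risk minimizer) is trained on two two-cluster datasets whose class-conditional features share the same means and covariances but differ otherwise (Gaussian vs. Rademacher noise, which matches the first two moments per coordinate). All parameter choices (n, d, the helper sample_mixture) are illustrative assumptions; in the proportional regime the two rows of errors should nearly coincide, as the theorem predicts for moment-matched distributions.

# Minimal sketch of the universality claim (illustrative, not the paper's method):
# train y_hat = F(Theta^T x) on two mixtures with matched means/covariances
# but different class-conditional distributions, then compare errors.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d = 4000, 400  # samples and dimension, proportional regime (assumed values)

def sample_mixture(n, d, gaussian, rng):
    """Two-cluster mixture with class means +/- mu and unit-variance noise.
    gaussian=False uses Rademacher noise: same mean (0) and covariance
    (identity) per coordinate, but a different distribution."""
    mu = np.ones(d) / np.sqrt(d)
    y = rng.integers(0, 2, size=n) * 2 - 1          # labels in {-1, +1}
    noise = (rng.standard_normal((n, d)) if gaussian
             else rng.choice([-1.0, 1.0], size=(n, d)))
    X = y[:, None] * mu[None, :] + noise
    return X, y

errs = {}
for name, gaussian in [("gaussian", True), ("rademacher", False)]:
    X, y = sample_mixture(n, d, gaussian, rng)
    Xt, yt = sample_mixture(n, d, gaussian, rng)    # fresh test set
    clf = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)  # regularized ERM
    errs[name] = (1 - clf.score(X, y), 1 - clf.score(Xt, yt))

for name, (tr, te) in errs.items():
    print(f"{name:>10}: train err {tr:.3f}, test err {te:.3f}")
# Consistent with the universality result, the two rows should be close.
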

Type
conference paper

Author(s)
Dandi, Yatin (EPFL)
Stephan, Ludovic (EPFL)
Krzakala, Florent (EPFL)
Loureiro, Bruno (École Normale Supérieure - PSL)
Zdeborová, Lenka (EPFL)

Date Issued
2024-12-11

Published in
NeurIPS Proceedings
Subjects
theoretical analysis • high-dimensional statistics • universality • weak convergence • mixture models • sampling • statistical physics

URL
View in NeurIPS Proceedings:
https://proceedings.neurips.cc/paper_files/paper/2023/hash/abccb8a90b30d45b948360ba41f5a20f-Abstract-Conference.html
Editorial or Peer reviewed
REVIEWED

Written at
EPFL

EPFL units
SPOC
IDEPHICS
Event name: Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
Event acronym: NeurIPS 2023
Event place: New Orleans Convention Center, USA
Event date: 2023-12-11

Available on Infoscience
July 24, 2024
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/240444