Infoscience

conference paper

On the Double Descent of Random Features Models Trained with SGD

Liu, Fanghui • Suykens, Johan A.K. • Cevher, Volkan
2022
[Proceedings of NeurIPS 2022]
36th Conference on Neural Information Processing Systems (NeurIPS 2022)

We study the generalization properties of random features (RF) regression in high dimensions, optimized by stochastic gradient descent (SGD), in the under- and over-parameterized regimes. In this work, we derive precise non-asymptotic error bounds for RF regression under both constant and polynomial-decay step-size SGD settings, and observe the double descent phenomenon both theoretically and empirically. Our analysis shows how to cope with multiple randomness sources (initialization, label noise, and data sampling, as well as stochastic gradients) with no closed-form solution, and it also goes beyond the commonly used Gaussian/spherical data assumption. Our theoretical results demonstrate that, with SGD training, RF regression still generalizes well in the interpolation regime; they also characterize the double descent behavior through the unimodality of the variance and the monotonic decrease of the bias. Moreover, we prove that constant step-size SGD incurs no loss in convergence rate compared to the exact minimum-norm interpolator, giving a theoretical justification for using SGD in practice.
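As context for the abstract, the sketch below illustrates the setup it describes: random features regression with a frozen random first layer, the output layer trained by constant step-size SGD, compared against the exact minimum-norm interpolator. It is a minimal NumPy illustration, not the authors' code; the ReLU feature map, the Gaussian data model, the problem sizes, and the step size are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the paper works in the high-dimensional regime where
# the sample size n, input dimension d, and feature count N grow together.
n, d, N = 200, 50, 400

# Synthetic data with a noisy linear target. This is a stand-in only: the
# paper's assumptions go beyond Gaussian/spherical data.
X = rng.standard_normal((n, d))
theta_star = rng.standard_normal(d) / np.sqrt(d)
y = X @ theta_star + 0.1 * rng.standard_normal(n)
X_test = rng.standard_normal((1000, d))
y_test = X_test @ theta_star

# Random features phi(x) = relu(W x / sqrt(d)) / sqrt(N). W is drawn once
# (one randomness source in the analysis) and frozen; only the output
# layer theta is trained.
W = rng.standard_normal((N, d))

def features(X):
    return np.maximum(X @ W.T / np.sqrt(d), 0.0) / np.sqrt(N)

Phi, Phi_test = features(X), features(X_test)

# Train the output layer with constant step-size SGD on the squared loss
# (the paper also analyzes polynomial-decay step sizes).
theta = np.zeros(N)
gamma = 1.0
for epoch in range(200):
    for t in rng.permutation(n):           # randomness source: data sampling
        resid = Phi[t] @ theta - y[t]
        theta -= gamma * resid * Phi[t]    # stochastic gradient step

# Baseline: the exact minimum-norm interpolator, the comparator for the
# paper's convergence-rate result.
theta_mn = np.linalg.pinv(Phi) @ y

for name, th in [("SGD", theta), ("min-norm", theta_mn)]:
    print(f"{name:9s} train MSE {np.mean((Phi @ th - y) ** 2):.4f}  "
          f"test MSE {np.mean((Phi_test @ th - y_test) ** 2):.4f}")
```

Sweeping N from well below n to well above it, and plotting the test error of the resulting models, traces the double descent curve whose bias and variance behavior the paper characterizes.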

Files
Name: Fanghui_RFF.pdf
Type: Postprint
Version: Accepted version
Access type: openaccess
License Condition: copyright
Size: 855.85 KB
Format: Adobe PDF
Checksum (MD5): 2834b1d9a7eb925dbbe8e408352e6e27
