Infoscience

conference paper not in proceedings

Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study

Wu, Yongtao • Zhu, Zhenyu • Liu, Fanghui • Chrysos, Grigorios • Cevher, Volkan
2022
36th Conference on Neural Information Processing Systems (NeurIPS)

The neural tangent kernel (NTK) is a powerful tool for analyzing the training dynamics of neural networks and their generalization bounds. NTK analysis has so far focused on typical neural network architectures and remains incomplete for neural networks with Hadamard products (NNs-Hp), e.g., StyleGAN and polynomial neural networks (PNNs). In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, namely polynomial neural networks. We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of the NTK. Based on our results, we elucidate the separation between PNNs and standard neural networks with respect to extrapolation and spectral bias. Our two key insights are that, compared to standard neural networks, PNNs can fit more complicated functions in the extrapolation regime and admit a slower eigenvalue decay of the respective NTK, leading to faster learning of high-frequency functions. Moreover, our theoretical results can be extended to other types of NNs-Hp, which broadens the scope of our work. Our empirical results validate these separations in broader classes of NNs-Hp, providing justification for a deeper understanding of neural architectures.
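For readers unfamiliar with the architecture class the abstract refers to, the following is a minimal Python/NumPy sketch of a polynomial network built from Hadamard (element-wise) products. It is not the paper's exact parametrization; all names, dimensions, and the specific recursion are illustrative placeholders.

    import numpy as np

    def pnn_forward(z, weights, C, beta):
        """Forward pass of a simple degree-N polynomial net.

        Each step multiplies a linear transform of the input element-wise
        (Hadamard product) with the running representation, so the output
        is a polynomial of degree N = len(weights) in the input z.
        """
        x = weights[0] @ z                  # degree-1 term
        for A in weights[1:]:
            x = (A @ z) * x + x             # Hadamard product raises the degree
        return C @ x + beta                 # final linear head

    rng = np.random.default_rng(0)
    d, k, o, degree = 8, 16, 1, 3           # input dim, hidden dim, output dim, degree
    weights = [rng.standard_normal((k, d)) / np.sqrt(d) for _ in range(degree)]
    C = rng.standard_normal((o, k)) / np.sqrt(k)
    beta = np.zeros(o)

    z = rng.standard_normal(d)
    print(pnn_forward(z, weights, C, beta)) # output is a degree-3 polynomial in z

The Hadamard product between a linear map of the input and the running representation is what raises the polynomial degree at each step; this structural feature is what distinguishes NNs-Hp from standard feed-forward networks in the NTK analysis described above.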

Details
Type
conference paper not in proceedings
Author(s)
Wu, Yongtao  
Zhu, Zhenyu  
Liu, Fanghui  
Chrysos, Grigorios  
Cevher, Volkan
Date Issued

2022

Number of pages

36

Subjects

ml-ai

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
LIONS  
Event name: 36th Conference on Neural Information Processing Systems (NeurIPS)
Event place: New Orleans, USA
Event date: November 28 - December 3, 2022

Available on Infoscience
November 28, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/192790