Infoscience
 
conference paper

Optimization and Generalization of Shallow Neural Networks with Quadratic Activation Functions

Sarao Mannelli, Stefano • Vanden-Eijnden, Eric • Zdeborová, Lenka
2020
Proceedings of the 2020 Advances in Neural Information Processing Systems
Advances in Neural Information Processing Systems

We study the dynamics of optimization and the generalization properties of one-hidden-layer neural networks with quadratic activation function in the overparametrized regime, where the layer width m is larger than the input dimension d.

We consider a teacher-student scenario where the teacher network has the same structure as the student but a smaller hidden-layer width m* ≤ m.
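Concretely, under one common normalization (an assumption; the abstract does not fix the scaling or the second-layer weights), the two networks can be written as:

```latex
f_{\mathrm{student}}(x) \;=\; \frac{1}{m}\sum_{k=1}^{m} \left(w_k \cdot x\right)^2,
\qquad
f_{\mathrm{teacher}}(x) \;=\; \frac{1}{m^*}\sum_{k=1}^{m^*} \left(w_k^* \cdot x\right)^2,
\qquad x \in \mathbb{R}^d,
```

with the generalization error measured as the average of (f_student(x) − f_teacher(x))² over fresh inputs x.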

We describe how the empirical loss landscape is affected by the number n of data samples and the width m* of the teacher network. In particular, we determine how the probability that there are no spurious minima on the empirical loss depends on n, d, and m*, thereby establishing conditions under which the neural network can in principle recover the teacher.

We also show that under the same conditions gradient descent dynamics on the empirical loss converges and leads to a small generalization error, i.e., it enables recovery in practice.

Finally we characterize the time-convergence rate of gradient descent in the limit of a large number of samples.

These results are confirmed by numerical experiments.
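As an illustration (not the authors' code; the sizes, normalization, initialization, and learning rate below are all assumptions), the teacher-student experiment described in the abstract can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration; the paper studies the
# overparametrized regime where the student width m exceeds d >= m*.
d, m_star, m, n = 3, 1, 6, 100   # input dim, teacher width, student width, samples
lr, steps = 0.01, 2000

def predict(W, X):
    # One-hidden-layer net with quadratic activation, averaged over units:
    # f_W(x) = (1/width) * sum_k (w_k . x)^2  (second layer fixed, an assumption).
    return ((X @ W.T) ** 2).mean(axis=1)

# Teacher-student setup: Gaussian inputs labelled by a narrower teacher.
W_teacher = rng.standard_normal((m_star, d))
X = rng.standard_normal((n, d))
y = predict(W_teacher, X)

# Overparametrized student trained by gradient descent on the empirical
# squared loss L(W) = (1/n) * sum_i (f_W(x_i) - y_i)^2.
W = 0.3 * rng.standard_normal((m, d))
loss_init = np.mean((predict(W, X) - y) ** 2)
for _ in range(steps):
    err = predict(W, X) - y                               # shape (n,)
    # dL/dw_k = (4/(n*m)) * sum_i err_i * (w_k . x_i) * x_i
    grad = (4.0 / (n * m)) * (err[:, None] * (X @ W.T)).T @ X
    W -= lr * grad
train_loss = np.mean((predict(W, X) - y) ** 2)

# Generalization error: compare student and teacher on fresh samples.
X_test = rng.standard_normal((2000, d))
test_loss = np.mean((predict(W, X_test) - predict(W_teacher, X_test)) ** 2)
print(f"init {loss_init:.3f} -> train {train_loss:.2e}, test {test_loss:.2e}")
```

With enough samples relative to d and m*, gradient descent in this sketch typically drives both the training and test losses down together, mirroring the recovery conditions the abstract describes.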

Details
Type
conference paper
Author(s)
Sarao Mannelli, Stefano
Vanden-Eijnden, Eric
Zdeborová, Lenka  
Editors
Larochelle, H.
•
Ranzato, M.
•
Hadsell, R.
•
Balcan, M. F.
•
Lin, H.
Date Issued

2020

Publisher

Curran Associates, Inc.

Published in
Proceedings of the 2020 Advances in Neural Information Processing Systems
Series title/Series vol.

Advances in Neural Information Processing Systems; 33

Volume

33

Start page

13445

URL

Paper

https://papers.nips.cc/paper/2020/file/9b8b50fb590c590ffbf1295ce92258dc-Paper.pdf
Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
SPOC1  
SPOC2  
Event name
Advances in Neural Information Processing Systems
Event date
Dec 6, 2020 – Dec 12, 2020

Available on Infoscience
March 5, 2021
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/175777
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.