Infoscience (EPFL)
Conference paper (not in proceedings)

Lipschitz constant estimation for Neural Networks via sparse polynomial optimization

Latorre, Fabian; Rolland, Paul Thierry Yves; Cevher, Volkan
April 26, 2020
8th International Conference on Learning Representations (ICLR 2020)

We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bounds on the Lipschitz constant of neural networks. The underlying optimization problems boil down to either linear (LP) or semidefinite (SDP) programming. We show how to use the sparse connectivity of a network to significantly reduce the complexity of computation. This is especially useful for convolutional as well as pruned neural networks. We conduct experiments on networks with random weights as well as networks trained on MNIST, showing that in the particular case of the ℓ1-Lipschitz constant, our approach yields superior estimates compared to baselines available in the literature.
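The abstract describes the approach at a high level only. For orientation, the sketch below is a hypothetical illustration (not code from the paper; function and variable names are made up) of the classical product-of-operator-norms upper bound that relaxation-based methods such as LiPopt are designed to tighten. For a scalar-output network under ℓ∞ input perturbations, the Lipschitz constant is the supremum of the gradient's ℓ1 norm, which is the ℓ1 case mentioned above.

    import numpy as np

    def op_norm_inf(W):
        # Induced operator norm of W as a map (l_inf -> l_inf):
        # the maximum l1 norm over the rows of W.
        return np.max(np.sum(np.abs(W), axis=1))

    def product_norm_bound(weights):
        # Upper bound on the l_inf Lipschitz constant of a feed-forward
        # ReLU network f = W_L o relu o ... o relu o W_1. Valid because
        # ReLU is 1-Lipschitz and the Lipschitz constant of a composition
        # is at most the product of its factors' constants. This is the
        # loose baseline that LP/SDP relaxations aim to improve on.
        bound = 1.0
        for W in weights:
            bound *= op_norm_inf(W)
        return bound

    # Example: a random two-layer network with scalar output
    # (dimensions chosen arbitrarily for illustration).
    rng = np.random.default_rng(0)
    weights = [rng.standard_normal((16, 8)),   # W_1: R^8 -> R^16
               rng.standard_normal((1, 16))]   # W_2: R^16 -> R
    print(product_norm_bound(weights))

For the scalar-output case the last factor reduces to the ℓ1 norm of the output layer's weight vector. According to the abstract, LiPopt instead formulates the bound as a polynomial optimization problem over the whole network and relaxes it to an LP or SDP, exploiting sparse connectivity, which is how it can improve on layer-by-layer products like the one above.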

Files
Name: lips_poly.pdf
Type: Postprint
Version: Accepted version
Access type: openaccess
License Condition: Copyright
Size: 627.79 KB
Format: Adobe PDF
Checksum (MD5): 30711ec269af0c1dde6cbf2a56f1ba75

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.