Infoscience
conference paper

Entropy and mutual information in models of deep neural networks

Gabrie, Marylou • Manoel, Andre • Luneau, Clement • Barbier, Jean • Macris, Nicolas • Krzakala, Florent • Zdeborova, Lenka
January 1, 2018
Advances in Neural Information Processing Systems 31 (NIPS 2018)
32nd Conference on Neural Information Processing Systems (NIPS)

We examine a class of stochastic deep learning models with a tractable method to compute information-theoretic quantities. Our contributions are three-fold: (i) We show how entropies and mutual informations can be derived from heuristic statistical physics methods, under the assumption that weight matrices are independent and orthogonally-invariant. (ii) We extend particular cases in which this result is known to be rigorously exact by providing a proof for two-layer networks with Gaussian random weights, using the recently introduced adaptive interpolation method. (iii) We propose an experiment framework with generative models of synthetic datasets, on which we train deep neural networks with a weight constraint designed so that the assumption in (i) is verified during learning. We study the behavior of entropies and mutual informations throughout learning and conclude that, in the proposed setting, the relationship between compression and generalization remains elusive.
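To illustrate the kind of quantity the paper tracks, a minimal sketch of the simplest special case — a single linear-Gaussian layer y = Wx + ξ, where the mutual information I(x; y) has a closed form. This is not the paper's multi-layer replica/interpolation method; the function name and setup here are illustrative assumptions:

```python
import numpy as np

def linear_gaussian_mi(W, noise_var):
    """I(x; y) in nats for the linear-Gaussian channel y = W x + xi,
    with x ~ N(0, I) and xi ~ N(0, noise_var * I).
    Closed form: I = 1/2 * log det(I + W W^T / noise_var)."""
    m = W.shape[0]
    # slogdet is numerically stabler than log(det(...)) for large matrices
    sign, logdet = np.linalg.slogdet(np.eye(m) + W @ W.T / noise_var)
    return 0.5 * logdet

# Identity map in 2 dimensions with unit noise variance:
# I = 1/2 * log det(2 I) = log 2 ≈ 0.693 nats
print(linear_gaussian_mi(np.eye(2), 1.0))
```

In deeper stochastic networks with nonlinearities this determinant formula no longer applies layer-by-layer, which is why the paper resorts to statistical-physics estimators under the orthogonally-invariant-weights assumption.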

Details
Type
conference paper
Web of Science ID

WOS:000461823301078

Author(s)
Gabrie, Marylou
Manoel, Andre
Luneau, Clement  
Barbier, Jean  
Macris, Nicolas  
Krzakala, Florent
Zdeborova, Lenka
Date Issued

2018-01-01

Publisher

Neural Information Processing Systems (NIPS)

Publisher place

La Jolla

Published in
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Series title/Series vol.

Advances in Neural Information Processing Systems

Volume

31

Subjects

Computer Science, Artificial Intelligence

•

Computer Science

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
LTHC  
Event name
32nd Conference on Neural Information Processing Systems (NIPS)
Event place
Montreal, Canada
Event date
Dec 02-08, 2018

Available on Infoscience
June 18, 2019
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/158048
  • Contact
  • infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved