conference paper

Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron

Abbe, Emmanuel • Li, Shuangping • Sly, Allan
January 1, 2022
2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS 2021)
62nd IEEE Annual Symposium on Foundations of Computer Science (FOCS)

We consider the symmetric binary perceptron model, a simple model of neural networks that has gathered significant attention in the statistical physics, information theory and probability theory communities, with recent connections made to the performance of learning algorithms in Baldassi et al. '15.
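For reference, a standard formulation of the model (following Aubin et al. '19; the margin \kappa > 0 and the constraint density \alpha > 0 are parameters of this sketch rather than quantities fixed in the abstract): the partition function counts the solutions

Z_n = \sum_{x \in \{-1,+1\}^n} \prod_{a=1}^{m} \mathbf{1}\{ |\langle g_a, x \rangle| \le \kappa \sqrt{n} \}, \qquad m = \lfloor \alpha n \rfloor,

where g_1, \dots, g_m are i.i.d. standard Gaussian vectors in \mathbb{R}^n.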

We establish that the partition function of this model, normalized by its expected value, converges to a lognormal distribution. As a consequence, several conjectures for this model follow: (i) the contiguity conjecture of Aubin et al. '19 between the planted and unplanted models in the satisfiable regime; (ii) the sharp threshold conjecture; (iii) the frozen 1-RSB conjecture in the symmetric case, first conjectured by Krauth and Mézard '89 in the asymmetric case.
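To make the claim concrete, here is a minimal brute-force sketch in Python (hypothetical parameter choices; feasible only for tiny n). It uses the exact first moment E[Z_n] = 2^n (2\Phi(\kappa) - 1)^m, which follows since \langle g_a, x \rangle / \sqrt{n} ~ N(0,1) for each fixed x, and checks that Z_n / E[Z_n] fluctuates on a constant scale rather than concentrating at 1:

import math
import numpy as np

# Minimal sketch (hypothetical parameters; brute force, so tiny n only):
# count solutions of the symmetric binary perceptron and compare the
# partition function Z to its exact expectation E[Z].
rng = np.random.default_rng(1)
n, alpha, kappa, trials = 14, 0.4, 1.0, 200
m = int(alpha * n)

# All 2^n sign vectors in {-1,+1}^n, one per row.
X = 1 - 2 * ((np.arange(2 ** n)[:, None] >> np.arange(n)) & 1)

# For fixed x, <g_a, x>/sqrt(n) ~ N(0,1), so E[Z] = 2^n * P(|N(0,1)| <= kappa)^m.
p = math.erf(kappa / math.sqrt(2))  # P(|N(0,1)| <= kappa)
EZ = 2 ** n * p ** m

ratios = []
for _ in range(trials):
    G = rng.standard_normal((m, n))  # constraint vectors g_1, ..., g_m
    Z = np.sum(np.all(np.abs(X @ G.T) <= kappa * np.sqrt(n), axis=1))
    ratios.append(Z / EZ)

ratios = np.array(ratios)
print("mean of Z/E[Z]    :", ratios.mean())         # close to 1
print("std of log(Z/E[Z]):", np.log(ratios).std())  # order one, not vanishing

At this scale one can only see that log(Z_n / E[Z_n]) has order-one spread; the Gaussian shape of the limit is the asymptotic (n -> infinity) content of the theorem.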

In a recent work of Perkins-Xu '21, the last two conjectures were also established by proving that the partition function concentrates on an exponential scale, under an analytical assumption on a real-valued function. This left open the contiguity conjecture and the lognormal limit characterization, which are established here unconditionally, with the analytical assumption verified. In particular, our proof technique relies on a dense counterpart of the small graph conditioning method, which was developed for sparse models in the celebrated work of Robinson and Wormald.
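For orientation, the sparse-regime method alluded to (small subgraph conditioning, Robinson and Wormald; see also Janson '95) can be stated roughly as follows, all hypotheses being part of this sketch: if the short-cycle counts X_{k,n} converge to independent Poisson(\lambda_k) variables, if E[Z_n X_{k,n}] / E[Z_n] \to \lambda_k (1 + \delta_k) with \delta_k > -1, and if E[Z_n^2] / E[Z_n]^2 \to \exp(\sum_k \lambda_k \delta_k^2) < \infty, then

Z_n / E[Z_n] \xrightarrow{d} W = \prod_k (1 + \delta_k)^{P_k} e^{-\lambda_k \delta_k}, \qquad P_k \sim \mathrm{Poisson}(\lambda_k) \text{ independent}.

Per the abstract, the proof here adapts this scheme to the dense, Gaussian-disorder setting, where the limit of Z_n / E[Z_n] becomes lognormal.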

Type: conference paper
DOI: 10.1109/FOCS52979.2021.00041
Web of Science ID: WOS:000802209600031
Author(s): Abbe, Emmanuel • Li, Shuangping • Sly, Allan
Date Issued: 2022-01-01
Publisher: IEEE Computer Society
Publisher place: Los Alamitos
Published in: 2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS 2021)
ISBN of the book: 978-1-6654-2055-6
Series title/Series vol.: Annual IEEE Symposium on Foundations of Computer Science
Start page: 327
End page: 338

Subjects: Computer Science, Theory & Methods • Computer Science • perceptron model • neural networks • interpolation • partition function • sharp thresholds • solution space • freezing • networks • states

Editorial or Peer reviewed: REVIEWED
Written at: EPFL

Event name: 62nd IEEE Annual Symposium on Foundations of Computer Science (FOCS)
Event place: Electronic network (online)
Event date: Feb 07-10, 2022

Available on Infoscience: July 4, 2022
Use this identifier to reference this record: https://infoscience.epfl.ch/handle/20.500.14299/188858

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.