Infoscience
EPFL, École polytechnique fédérale de Lausanne
conference paper

Initialization Matters: Privacy-Utility Analysis of Overparameterized Neural Networks

Ye, Jiayuan • Zhu, Zhenyu • Liu, Fanghui • Shokri, Reza • Cevher, Volkan

2023

37th Annual Conference on Neural Information Processing Systems

We analytically investigate how over-parameterization of models in randomized machine learning algorithms impacts the information leakage about their training data. Specifically, we prove a privacy bound for the KL divergence between model distributions on worst-case neighboring datasets, and explore its dependence on the initialization, width, and depth of fully connected neural networks. We find that this KL privacy bound is largely determined by the expected squared gradient norm relative to model parameters during training. Notably, for the special setting of linearized networks, our analysis indicates that the squared gradient norm (and therefore the growth of privacy loss) is tied directly to the per-layer variance of the initialization distribution. Using this analysis, we demonstrate that the privacy bound improves with increasing depth under certain initializations (LeCun and Xavier), while it degrades with increasing depth under others (He and NTK). Our work reveals a complex interplay between privacy and depth that depends on the chosen initialization distribution. We further prove excess empirical risk bounds under a fixed KL privacy budget, and show that the interplay between the privacy-utility trade-off and depth is similarly affected by the initialization.
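For reference, the initialization schemes named in the abstract differ precisely in their per-layer weight variance. The sketch below gives the standard textbook definitions of these variances (the paper's exact parameterization may differ; this is an illustration, not the authors' code):

```python
def init_variance(scheme: str, fan_in: int, fan_out: int) -> float:
    """Per-layer weight variance for common initialization schemes.

    Standard textbook definitions; the paper's precise
    parameterization is not reproduced here.
    """
    if scheme == "lecun":
        # LeCun: variance 1/fan_in
        return 1.0 / fan_in
    if scheme == "xavier":
        # Xavier/Glorot: variance 2/(fan_in + fan_out)
        return 2.0 / (fan_in + fan_out)
    if scheme == "he":
        # He/Kaiming: variance 2/fan_in (tuned for ReLU)
        return 2.0 / fan_in
    if scheme == "ntk":
        # NTK parameterization: unit-variance weights with a 1/sqrt(fan_in)
        # scaling in the forward pass; effective variance 1/fan_in.
        return 1.0 / fan_in
    raise ValueError(f"unknown scheme: {scheme}")

# Example: a square hidden layer of width 512
for scheme in ("lecun", "xavier", "he", "ntk"):
    print(scheme, init_variance(scheme, 512, 512))
```

Note that although LeCun and NTK yield the same effective forward-pass variance, the paper's result distinguishes them: the NTK scaling enters the gradient differently, which is why depth helps privacy under LeCun/Xavier but hurts it under He/NTK.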

Details
Type
conference paper
Author(s)
Ye, Jiayuan
Zhu, Zhenyu
Liu, Fanghui
Shokri, Reza
Cevher, Volkan
Date Issued

2023

Subjects

ML-AI

URL

Full paper on the NeurIPS website

https://neurips.cc/virtual/2023/poster/72128
Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
LIONS  
Event name: 37th Annual Conference on Neural Information Processing Systems
Event place: New Orleans, USA
Event date: December 10–16, 2023

Available on Infoscience
March 14, 2024
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/206108
Contact: infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.