Conference paper

Initialization Matters: Privacy-Utility Analysis of Overparameterized Neural Networks

Ye, Jiayuan • Zhu, Zhenyu • Liu, Fanghui • et al.
2023
37th Annual Conference on Neural Information Processing Systems

We analytically investigate how over-parameterization of models in randomized machine learning algorithms impacts the information leakage about their training data. Specifically, we prove a privacy bound for the KL divergence between model distributions on worst-case neighboring datasets and explore its dependence on the initialization, width, and depth of fully connected neural networks. We find that this KL privacy bound is largely determined by the expected squared gradient norm relative to model parameters during training. Notably, for the special setting of linearized networks, our analysis indicates that the squared gradient norm (and therefore the escalation of privacy loss) is tied directly to the per-layer variance of the initialization distribution. Using this analysis, we demonstrate that the privacy bound improves with increasing depth under certain initializations (LeCun and Xavier), while it degrades with increasing depth under others (He and NTK). Our work reveals a complex interplay between privacy and depth that depends on the chosen initialization distribution. We further prove excess empirical risk bounds under a fixed KL privacy budget, and show that the interplay between the privacy-utility trade-off and depth is similarly affected by the initialization.
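For context on the initialization schemes named in the abstract (LeCun, Xavier, He, NTK), the sketch below tabulates their standard per-layer weight variances for a fully connected layer; the function name and the width-512 example are illustrative assumptions and do not come from the paper.

    # Illustrative sketch: standard per-layer weight variances of the
    # initialization schemes named in the abstract (not the paper's code).
    def init_variance(scheme: str, fan_in: int, fan_out: int) -> float:
        """Per-layer weight variance sigma^2 prescribed by each scheme."""
        if scheme == "lecun":        # N(0, 1/fan_in)
            return 1.0 / fan_in
        if scheme == "xavier":       # N(0, 2/(fan_in + fan_out))
            return 2.0 / (fan_in + fan_out)
        if scheme == "he":           # N(0, 2/fan_in)
            return 2.0 / fan_in
        if scheme == "ntk":          # unit-variance weights; a 1/sqrt(fan_in)
            return 1.0               # factor is applied in the forward pass
        raise ValueError(f"unknown scheme: {scheme}")

    if __name__ == "__main__":
        # Example: a square hidden layer of width 512.
        for scheme in ("lecun", "xavier", "he", "ntk"):
            print(f"{scheme:7s} variance = {init_variance(scheme, 512, 512):.6f}")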

Files
Name: Initialization Matters Privacy-Utility Analysis of Overparameterized Neural Networks.pdf
Type: Postprint
Access type: Open access
License Condition: Copyright
Size: 541.49 KB
Format: Adobe PDF
Checksum (MD5): f257bab60168ccc18c057d210057a31b
