Infoscience
conference paper

Personalized Privacy-Preserving Distributed Learning on Heterogeneous Data

Pradeep, Aditya • Gastpar, Michael C.
2023
Proceedings of the 2023 IEEE International Symposium on Information Theory (ISIT)
2023 IEEE International Symposium on Information Theory (ISIT)

One major challenge in distributed learning is to learn efficiently for each client when the data across clients is heterogeneous, or non-IID (not independent and identically distributed). This poses a significant challenge because the data of the other clients may not be helpful to each individual client. The following question thus arises: can each individual client's performance be improved with access to the data of other clients in this heterogeneous data setting? A further challenge is to obtain a good personalized model while still maintaining the privacy of local data samples. We consider a model in which the client data distributions are not identical and can be dependent. In this heterogeneous data setting, we study the problem of distributed learning of data distributions. We propose a personalized linear estimator for each client and show that this estimator is never worse, and can be substantially better (by up to a factor equal to the number of clients), than the sample mean estimator, while still concentrating around the true probability. This estimator can be implemented by privacy-preserving schemes in both the cryptographic and differentially private settings.
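The abstract does not specify the form of the proposed estimator, but a generic "personalized linear estimator" of a client's distribution mean can be sketched as a convex combination of that client's own sample mean and the pooled sample mean across all clients. The setup, the weight `alpha`, and the function below are illustrative assumptions for exposition, not the authors' construction:

```python
import numpy as np

def personalized_estimate(local_samples, all_samples, alpha):
    """Linear combination of the local sample mean and the pooled sample mean.

    alpha=1.0 recovers the plain local sample mean; alpha<1.0 borrows
    strength from the other clients' data. (Illustrative only: the weight
    rule is a generic shrinkage heuristic, not the estimator in the paper.)
    """
    local_mean = np.mean(local_samples)
    pooled_mean = np.mean(np.concatenate(all_samples))
    return alpha * local_mean + (1.0 - alpha) * pooled_mean

# Toy heterogeneous setting: 5 clients with different (but nearby)
# Bernoulli means. When the other clients' means are close to client 0's,
# mixing in the pooled mean can reduce client 0's estimation error.
rng = np.random.default_rng(0)
mus = [0.50, 0.52, 0.48, 0.51, 0.49]
samples = [rng.binomial(1, mu, size=20).astype(float) for mu in mus]

est_local = personalized_estimate(samples[0], samples, alpha=1.0)  # sample mean only
est_mixed = personalized_estimate(samples[0], samples, alpha=0.3)  # leans on pooled data
```

Under the privacy-preserving schemes mentioned in the abstract, the pooled mean would be computed without revealing individual clients' samples (e.g., via secure aggregation or a differentially private release); the plain `np.concatenate` above stands in for that aggregate purely for illustration.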

  • Details
  • Metrics
Type
conference paper
DOI
10.1109/ISIT54713.2023.10206907
Author(s)
Pradeep, Aditya  
Gastpar, Michael C.  
Date Issued
2023
Published in
Proceedings of the 2023 IEEE International Symposium on Information Theory (ISIT)
Start page
2272
End page
2277

Subjects
Distributed learning • Heterogeneous data • Federated learning

Note
The work in this paper was supported in part by the Swiss National Science Foundation under Grant 200364.
Editorial or Peer reviewed
REVIEWED
Written at
EPFL

EPFL units
LINX  
Event name
2023 IEEE International Symposium on Information Theory (ISIT)
Event place
Taipei, Taiwan
Event date
June 25-30, 2023

Available on Infoscience
September 22, 2023
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/200919

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.