conference paper

Tight Differential Privacy Guarantees for the Shuffle Model with k-Randomized Response

Biswas, Sayan • Jung, Kangsoo • Palamidessi, Catuscia
Editors: Sedes, F. • Tawbi, N.
January 1, 2024
Foundations and Practice of Security, Pt. I, FPS 2023
16th International Symposium on Foundations and Practice of Security (FPS)

Most differentially private algorithms assume either a central model, in which a trusted third party adds noise to queries made on the dataset, or a local model, in which the data owners perturb their data directly. However, the central model is vulnerable to a single point of failure, and the local model degrades the utility of the data significantly. The recently proposed shuffle model is an intermediate framework between the central and local paradigms: data owners send their locally privatized data to a server, where the messages are shuffled at random, making it impossible to link a privatized message to its sender. In this paper, we theoretically derive the tightest known differential privacy guarantee for the shuffle model with k-Randomized Response (k-RR) local randomizers under histogram queries, and we denoise the histogram produced by the shuffle model using the matrix inversion method to evaluate the utility of the privacy mechanism. We perform experiments on both synthetic and real data to compare the privacy-utility trade-off of the shuffle model with that of the central model privatized by adding state-of-the-art Gaussian noise to each bin. The results show that the statistical utilities of the central and shuffle models are almost comparable under the same level of differential privacy protection.
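
To illustrate the pipeline described in the abstract, the following is a minimal sketch, not the paper's implementation: a k-RR local randomizer, a shuffler reduced to a histogram count (shuffling preserves counts), and the matrix-inversion debiasing step. Function names such as k_rr and denoise, and the toy parameters, are illustrative assumptions.

import numpy as np

def k_rr(value, k, epsilon, rng):
    # k-ary Randomized Response: report the true value with probability
    # p = e^eps / (e^eps + k - 1), otherwise one of the other k-1 values
    # chosen uniformly at random.
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    if rng.random() < p:
        return value
    other = rng.integers(k - 1)
    return other if other < value else other + 1

def shuffle_and_histogram(reports, k, rng):
    # The shuffler only permutes the reports, so for a histogram query
    # the server's view reduces to the counts of each value.
    shuffled = rng.permutation(reports)
    return np.bincount(shuffled, minlength=k).astype(float)

def denoise(noisy_hist, k, epsilon):
    # Matrix-inversion debiasing: E[noisy_hist] = M @ true_hist with
    # M = q * J + (p - q) * I, so solve M x = noisy_hist for the estimate.
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    q = 1.0 / (np.exp(epsilon) + k - 1)
    M = np.full((k, k), q) + (p - q) * np.eye(k)
    return np.linalg.solve(M, noisy_hist)

# Toy usage on synthetic data (not the paper's datasets or parameters).
rng = np.random.default_rng(0)
k, epsilon, n = 5, 2.0, 10_000
data = rng.integers(k, size=n)
reports = np.array([k_rr(x, k, epsilon, rng) for x in data])
noisy = shuffle_and_histogram(reports, k, rng)
estimate = denoise(noisy, k, epsilon)
print(np.bincount(data, minlength=k), np.round(estimate).astype(int))

Because k-RR flips a value to every other value with equal probability, the expected noisy histogram is an affine function of the true histogram, which is why a single matrix inverse suffices to debias the counts.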
