Infoscience
conference paper

Fast-FedUL: A Training-Free Federated Unlearning with Provable Skew Resilience

Federated learning (FL) has recently emerged as a compelling machine learning paradigm, prioritizing the protection of privacy for training data. The growing need to address issues such as "the right to be forgotten" and to combat data poisoning attacks highlights the importance of techniques, known as unlearning, that remove specific training data from trained FL models. Although numerous unlearning methods have been proposed for centralized learning, they often prove inapplicable to FL due to fundamental differences in how the two learning paradigms operate. Consequently, unlearning in FL remains in its early stages and presents several challenges: many existing FL unlearning solutions require a costly retraining process, which is burdensome for clients, and they are validated primarily through experiments, lacking theoretical assurances. In this study, we introduce Fast-FedUL, an unlearning method tailored to FL that eliminates the need for retraining entirely. Through a careful analysis of the target client's influence on the global model in each round, we develop an algorithm that systematically removes the impact of the target client from the trained model. Beyond empirical findings, we offer a theoretical analysis that gives an upper bound on the difference between our unlearned model and the exact retrained model (the one obtained by retraining with only the untargeted clients). Experimental results with backdoor attack scenarios indicate that Fast-FedUL effectively removes almost all traces of the target client (achieving a mere 0.01% backdoor attack success rate on the unlearned model), while retaining the knowledge of the untargeted clients (reaching up to 98% accuracy on the main task). Notably, Fast-FedUL attains the lowest time complexity, running 1000 times faster than retraining.
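
The core idea described above, removing a target client's stored per-round contributions from the aggregated global model instead of retraining, can be illustrated with a minimal sketch. The class and names below are hypothetical and this is not the paper's actual algorithm; in particular, Fast-FedUL additionally accounts for how a removed update skews later rounds it influenced, which is what its skew-resilience bound covers.

import numpy as np

class ToyUnlearningServer:
    """Illustrative FedAvg-style server that logs per-round client updates
    so one client's contribution can later be subtracted (hypothetical)."""

    def __init__(self, init_model):
        self.model = np.asarray(init_model, dtype=float).copy()
        self.history = []  # one dict per round: client_id -> (weight, update)

    def aggregate(self, client_updates):
        # Standard FedAvg step: weighted average of the clients' updates.
        total = sum(w for w, _ in client_updates.values())
        step = sum(w * u for w, u in client_updates.values()) / total
        self.model = self.model + step
        self.history.append(client_updates)

    def unlearn(self, target):
        # Replace every stored aggregation step that included the target
        # client with the step recomputed from the remaining clients only.
        model = self.model.copy()
        for round_updates in self.history:
            if target not in round_updates:
                continue
            total = sum(w for w, _ in round_updates.values())
            old_step = sum(w * u for w, u in round_updates.values()) / total
            rest = {c: v for c, v in round_updates.items() if c != target}
            if rest:
                rest_total = sum(w for w, _ in rest.values())
                new_step = sum(w * u for w, u in rest.values()) / rest_total
            else:
                new_step = 0.0
            model = model + (new_step - old_step)
        return model

Unlike this naive per-round subtraction, the paper's method comes with a proven upper bound on the gap between the unlearned model and the model obtained by exact retraining on the untargeted clients, which is what allows it to skip retraining altogether.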

Details
Type
conference paper
DOI
10.1007/978-3-031-70362-1_4
Web of Science ID

WOS:001317380300004

Author(s)
Huynh, Thanh Trung  
•
Nguyen, Trong Bang
•
Nguyen, Phi Le
•
Nguyen, Tien Thai
•
Weidlich, Matthias
•
Quoc Viet Hung Nguyen
•
Aberer, Karl  
Editors
Bifet, A
•
Davis, J
•
Krilavicius, T
•
Kull, M
•
Ntoutsi, E
•
Zliobaite, I
Date Issued

2024-01-01

Publisher

Springer Nature

Publisher place

Cham

Published in
Machine Learning and Knowledge Discovery in Databases: Research Track, Part V, ECML PKDD 2024
ISBN of the book

978-3-031-70361-4

978-3-031-70362-1

Series title/Series vol.

Lecture Notes in Artificial Intelligence; 14945

ISSN (of the series)

2945-9133

1611-3349

Start page

55

End page

72

Subjects

Machine Unlearning
•
Federated Learning
•
Skew Resilience
Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
LSIR  
Event name
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)
Event acronym
ECML PKDD
Event place
Vilnius, Lithuania
Event date
2024-09-09 - 2024-09-13

Funder
Vingroup Joint Stock Company (Vingroup JSC)
Funding(s)
Vingroup Innovation Foundation (VINIF)
Grant Number
VINIF.2021.DA00128

Funder
Hanoi University of Science and Technology (HUST)
Grant Number
T2023-PC-028

Available on Infoscience
January 31, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/246118