Infoscience (EPFL, École polytechnique fédérale de Lausanne)
conference paper

Byzantine-Resilient Learning Beyond Gradients: Distributing Evolutionary Search

Kucharavy, Andrei • Monti, Matteo • Guerraoui, Rachid • Dolamic, Ljiljana
January 1, 2023
Proceedings of the 2023 Genetic and Evolutionary Computation Conference Companion (GECCO 2023 Companion)
Genetic and Evolutionary Computation Conference (GECCO)

Modern machine learning (ML) models are capable of impressive performance. However, this prowess is due not only to improvements in their architectures and training algorithms but also to a drastic increase in the computational power used to train them.

Such a drastic increase has led to growing interest in distributed ML, which in turn has made worker failures and adversarial attacks an increasingly pressing concern. While distributed Byzantine-resilient algorithms have been proposed in a differentiable setting, none exist in a gradient-free setting.

The goal of this work is to address this shortcoming. To that end, we introduce a more general definition of Byzantine resilience in ML, model-consensus, which extends the classical definition of distributed consensus. We then leverage this definition to show that a general class of gradient-free ML algorithms, (1,λ)-Evolutionary Search, can be combined with classical distributed consensus algorithms to yield gradient-free Byzantine-resilient distributed learning algorithms. We provide proofs and pseudo-code for two specific cases: Total Order Broadcast and proof-of-work leader election.

To our knowledge, this is the first time Byzantine resilience in gradient-free ML has been defined, and the first time algorithms to achieve it have been proposed.
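For context, the base algorithm class named in the abstract, (1,λ)-Evolutionary Search, can be sketched in a minimal single-node form as below. This is a hypothetical illustration under standard assumptions (Gaussian mutation, comma selection); the function name, parameters, and objective are illustrative and this is not the paper's distributed Byzantine-resilient construction.

```python
import random

def one_comma_lambda_es(objective, dim=5, lam=10, sigma=0.1,
                        generations=200, seed=0):
    """Minimal (1,lambda)-Evolutionary Search sketch.

    Each generation, a single parent produces `lam` offspring by
    Gaussian mutation; the best offspring becomes the next parent.
    Comma selection: the parent itself is always discarded.
    """
    rng = random.Random(seed)
    parent = [rng.uniform(-1, 1) for _ in range(dim)]
    for _ in range(generations):
        offspring = [
            [x + rng.gauss(0, sigma) for x in parent]
            for _ in range(lam)
        ]
        parent = min(offspring, key=objective)  # comma selection
    return parent

# Example: minimize the sphere function f(x) = sum(x_i^2).
sphere = lambda x: sum(v * v for v in x)
best = one_comma_lambda_es(sphere)
```

Because selection depends only on objective values of broadcast candidates, not on gradients, such a loop is the kind of algorithm the paper combines with classical consensus primitives (e.g., Total Order Broadcast) to tolerate Byzantine workers.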

Type
conference paper
DOI
10.1145/3583133.3590719
Web of Science ID
WOS:001117972600096

Author(s)
Kucharavy, Andrei  
Monti, Matteo  
Guerraoui, Rachid
Dolamic, Ljiljana
Corporate authors
ACM
Date Issued
2023-01-01
Publisher
Assoc Computing Machinery
Publisher place
New York
Published in
Proceedings of the 2023 Genetic and Evolutionary Computation Conference Companion (GECCO 2023 Companion)
ISBN of the book
979-8-4007-0120-7
Start page
295
End page
298
Subjects
Technology • Evolutionary Search • Gradient-Free Optimization • Distributed Machine Learning • Byzantine Fault Tolerance
Editorial or Peer reviewed
REVIEWED

Written at
EPFL

EPFL units
DCL
Event name
Genetic and Evolutionary Computation Conference (GECCO)
Event place
Lisbon, Portugal
Event date
Jul 15-19, 2023

Funder
Cyber-Defence Campus, armasuisse W+T, VBS
Grant Number
ARAMIS CYD-F-2021004

Available on Infoscience
March 18, 2024
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/206289