conference paper

Consensus Control for Decentralized Deep Learning

Kong, Lingjing • Lin, Tao • Koloskova, Anastasia • Jaggi, Martin • Stich, Sebastian U.
January 1, 2021
International Conference On Machine Learning, Vol 139
International Conference on Machine Learning (ICML)

Decentralized training of deep learning models enables on-device learning over networks, as well as efficient scaling to large compute clusters. Experiments in earlier works reveal that, even in a data-center setup, decentralized training often suffers from a degradation in model quality: models trained in a decentralized fashion generally achieve worse training and test performance than models trained in a centralized fashion, and this performance drop is impacted by parameters such as network size, communication topology, and data partitioning.

We identify the changing consensus distance between devices as a key parameter to explain the gap between centralized and decentralized training. We show in theory that when the training consensus distance is lower than a critical quantity, decentralized training converges as fast as its centralized counterpart. We empirically validate that the relation between generalization performance and consensus distance is consistent with this theoretical observation. Our empirical insights allow the principled design of better decentralized training schemes that mitigate the performance drop. To this end, we provide practical training guidelines and demonstrate their effectiveness in the data-center setup as an important first step.
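The abstract's central quantity can be made concrete. In the decentralized-SGD literature, consensus distance is commonly defined as Ξ_t² = (1/n) Σ_i ‖x̄_t − x_t^(i)‖², the average squared distance between each worker's parameters x_t^(i) and the network-mean model x̄_t, and gossip averaging with a mixing matrix shrinks it. The NumPy sketch below is a minimal illustration under these assumptions, not the authors' code; the ring mixing matrix and all names are hypothetical.

    import numpy as np

    def consensus_distance(params):
        """Average squared distance of each worker's (flattened) model
        to the mean model; params has shape (n_workers, dim)."""
        mean = params.mean(axis=0)
        return float(np.mean(np.sum((params - mean) ** 2, axis=1)))

    def gossip_step(params, W):
        """One gossip round: every worker replaces its model with a
        weighted average of its neighbours' models, encoded by the
        doubly stochastic mixing matrix W."""
        return W @ params

    # Hypothetical setup: 4 workers on a ring, each averaging with its
    # two neighbours (weight 1/4 each) and itself (weight 1/2).
    W = np.array([[0.50, 0.25, 0.00, 0.25],
                  [0.25, 0.50, 0.25, 0.00],
                  [0.00, 0.25, 0.50, 0.25],
                  [0.25, 0.00, 0.25, 0.50]])
    params = np.random.default_rng(0).normal(size=(4, 10))

    for t in range(4):
        print(f"round {t}: consensus distance = {consensus_distance(params):.4f}")
        params = gossip_step(params, W)  # distance contracts every round

In this toy example each gossip round contracts the squared consensus distance by roughly a factor of four (set by the spectral gap of W), which illustrates the control knob the abstract refers to: spending more communication per optimization step keeps the consensus distance below the critical level.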

Details
Type
conference paper
Web of Science ID
WOS:000683104605065
Author(s)
Kong, Lingjing
Lin, Tao
Koloskova, Anastasia
Jaggi, Martin
Stich, Sebastian U.
Date Issued
2021-01-01
Publisher
JMLR (Journal of Machine Learning Research)
Publisher place
San Diego
Published in
International Conference On Machine Learning, Vol 139
Series title/Series vol.
Proceedings of Machine Learning Research
Volume
139
Subjects
distributed online prediction • optimization
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
MLO
Event name
International Conference on Machine Learning (ICML)
Event place
Electronic network (online)
Event date
Jul 18-24, 2021
Available on Infoscience
September 25, 2021
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/181608