Conference paper

Decentralized deep learning with arbitrary communication compression

Koloskova, Anastasiia • Lin, Tao • Stich, Sebastian Urban • et al.
2019
Proceedings of the 8th International Conference on Learning Representations (ICLR 2020)

Decentralized training of deep learning models is a key element for enabling data privacy and on-device learning over networks, as well as for efficient scaling to large compute clusters. As current approaches are limited by network bandwidth, we propose the use of communication compression in the decentralized training context. We show that Choco-SGD achieves linear speedup in the number of workers for arbitrarily high compression ratios, on general non-convex functions and non-IID training data. We demonstrate the practical performance of the algorithm in two key scenarios: the training of deep learning models (i) over decentralized user devices, connected by a peer-to-peer network, and (ii) in a datacenter.
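
For context on the method named in the abstract, the following is a minimal sketch of a Choco-SGD-style iteration: each worker takes a local SGD step, exchanges a compressed difference between its model and a publicly shared estimate, and then performs a gossip (consensus) step on those estimates. The fully connected topology, the rand-k compressor, the toy quadratic objectives, and all variable names below are illustrative assumptions, not the paper's implementation.

# Minimal sketch of a Choco-SGD-style step (decentralized SGD with compressed
# gossip), simulated on a single machine. The compressor, mixing matrix, and
# toy local objectives are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 10                      # workers, model dimension
W = np.full((n, n), 1.0 / n)      # doubly stochastic mixing matrix (fully connected here)

# Toy non-identical local objectives f_i(x) = 0.5 * ||x - b_i||^2
b = rng.normal(size=(n, d))
grad = lambda i, x: x - b[i]

def compress(v, keep=0.2):
    # rand-k sparsification: keep a random fraction of coordinates, zero the rest
    mask = rng.random(v.shape) < keep
    return v * mask

x = np.zeros((n, d))              # local models x_i
x_hat = np.zeros((n, d))          # shared public estimates of the x_i, updated only via compressed messages
lr, gamma = 0.1, 0.5              # SGD step size and consensus step size

for step in range(200):
    # 1) local (stochastic) gradient step on each worker
    for i in range(n):
        x[i] -= lr * grad(i, x[i])
    # 2) each worker compresses the gap between its model and its public estimate
    q = np.stack([compress(x[i] - x_hat[i]) for i in range(n)])
    # 3) neighbors apply the compressed messages to their copies of the public estimates
    x_hat += q
    # 4) gossip/consensus step on the public estimates
    for i in range(n):
        x[i] += gamma * sum(W[i, j] * (x_hat[j] - x_hat[i]) for j in range(n))

print("consensus distance:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to optimum:", np.linalg.norm(x.mean(axis=0) - b.mean(axis=0)))

Only the compressed differences (step 2) travel over the network, which is what allows the scheme to tolerate arbitrarily high compression ratios at the cost of slower consensus, controlled by the consensus step size.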

Files
  • Name: 1907.09356 (1).pdf
  • Type: Publisher's Version
  • Version: Published version
  • Access type: Open access
  • License Condition: Copyright
  • Size: 7.84 MB
  • Format: Adobe PDF
  • Checksum (MD5): 69a975f04db00c56725ad139bd8c9c81
