Infoscience
EPFL, École polytechnique fédérale de Lausanne
 
Conference paper

COLA: Decentralized Linear Learning

He, Lie • Bian, An • Jaggi, Martin
January 1, 2018
Advances In Neural Information Processing Systems 31 (NeurIPS 2018)
32nd Conference on Neural Information Processing Systems (NeurIPS)

Decentralized machine learning is a promising emerging paradigm in view of global challenges of data ownership and privacy. We consider learning of linear classification and regression models, in the setting where the training data is decentralized over many user devices, and the learning algorithm must run on-device, on an arbitrary communication network, without a central coordinator. We propose COLA, a new decentralized training algorithm with strong theoretical guarantees and superior practical performance. Our framework overcomes many limitations of existing methods, achieves communication efficiency, scalability, and elasticity, provides resilience to changes in data, and allows for unreliable and heterogeneous participating devices.
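To make the setting the abstract describes concrete, here is a minimal illustrative sketch: plain decentralized gradient descent with gossip averaging for least-squares linear regression on a ring of devices. This is not the COLA algorithm itself (the paper's method and guarantees differ); it only illustrates the setup in which data stays on each device and devices communicate with neighbors, without any central coordinator. All names and parameters below are assumptions for the example.

```python
import numpy as np

# Sketch only: decentralized gradient descent (NOT COLA) for least-squares
# linear regression on a ring network of devices.
rng = np.random.default_rng(0)
d, n_devices, n_local = 5, 8, 20
w_true = rng.normal(size=d)

# Each device holds its own private slice of the training data (noiseless here).
local_data = []
for _ in range(n_devices):
    X = rng.normal(size=(n_local, d))
    y = X @ w_true
    local_data.append((X, y))

# One model copy per device; ring topology: device i talks to i-1 and i+1.
W = np.zeros((n_devices, d))
lr = 0.05
for step in range(300):
    # 1) Local computation: a gradient step on each device's own data only.
    for i, (X, y) in enumerate(local_data):
        grad = X.T @ (X @ W[i] - y) / n_local
        W[i] = W[i] - lr * grad
    # 2) Communication: gossip-average model parameters with ring neighbors.
    W = (np.roll(W, 1, axis=0) + W + np.roll(W, -1, axis=0)) / 3.0

# All device models approach the same global least-squares solution.
err = max(np.linalg.norm(W[i] - w_true) for i in range(n_devices))
print(f"max per-device error: {err:.4f}")
```

The gossip step is what replaces the central server: each device only ever exchanges its current parameter vector with its immediate neighbors, yet the averaging drives all copies toward a common model.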

Name: cola-incl-supplementary.pdf
Type: Publisher's Version
Version: Published version
Access type: Open access
License Condition: CC BY
Size: 672.29 KB
Format: Adobe PDF
Checksum (MD5): 58482b7355f5986c829df94ec53c009c

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.