Infoscience
EPFL, École polytechnique fédérale de Lausanne
conference paper

Communication trade-offs for Local-SGD with large step size

Patel, Kumar Kshitij • Dieuleveut, Aymeric
January 1, 2019
Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
33rd Conference on Neural Information Processing Systems (NeurIPS)

Synchronous mini-batch SGD is state-of-the-art for large-scale distributed machine learning. In practice, however, its convergence is bottlenecked by slow communication rounds between worker nodes. A natural way to reduce communication is the "local-SGD" model, in which the workers train their models independently and synchronize only once in a while. This algorithm improves the computation-communication trade-off, but its convergence is not well understood. We propose a non-asymptotic error analysis that enables comparison to one-shot averaging, i.e., a single communication round among independent workers, and to mini-batch averaging, i.e., communicating at every step. We also provide adaptive lower bounds on the communication frequency for large step sizes (t^(-α), α ∈ (1/2, 1)) and show that local-SGD reduces communication by a factor of O(√T / P^(3/2)), with T the total number of gradients and P the number of machines.
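The local-SGD scheme described in the abstract can be sketched in a few lines: P workers each run SGD with a decaying step size and average their parameters every H steps, so that H = 1 recovers mini-batch averaging and H = T recovers one-shot averaging. The sketch below is a minimal illustration on a toy least-squares problem; the function name, the toy objective, and the 0.1 step-size prefactor are assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np

def local_sgd(P=4, T=1000, H=50, alpha=0.75, d=5, seed=0):
    """Local-SGD on a toy least-squares problem (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    w_star = rng.normal(size=d)         # ground-truth parameters
    w = np.zeros((P, d))                # one parameter vector per worker
    for t in range(1, T + 1):
        eta = 0.1 * t ** (-alpha)       # step size of the form t^(-alpha), alpha in (1/2, 1)
        x = rng.normal(size=(P, d))     # one fresh sample per worker
        y = np.einsum('pd,d->p', x, w_star) + rng.normal(scale=0.1, size=P)
        pred = np.einsum('pd,pd->p', x, w)   # each worker predicts with its own model
        grad = (pred - y)[:, None] * x       # gradient of 0.5 * (pred - y)^2 per worker
        w -= eta * grad                      # independent local SGD steps
        if t % H == 0:                       # communication round: average all models
            w[:] = w.mean(axis=0)
    return w.mean(axis=0), w_star
```

Setting H between 1 and T trades communication rounds (T / H of them) against the extra variance the workers accumulate between synchronizations, which is the trade-off the paper quantifies.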

Type
conference paper
Web of Science ID

WOS:000535866905027

Author(s)
Patel, Kumar Kshitij
Dieuleveut, Aymeric  
Date Issued

2019-01-01

Publisher

Neural Information Processing Systems (NIPS)

Publisher place

La Jolla

Published in
Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Series title/Series vol.

Advances in Neural Information Processing Systems

Volume

32

Subjects

Computer Science, Artificial Intelligence • Computer Science • stochastic-approximation

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
MLO  
Event name
33rd Conference on Neural Information Processing Systems (NeurIPS)
Event place
Vancouver, Canada
Event date
Dec 08-14, 2019

Available on Infoscience
July 10, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/169968

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.