research article

NUQSGD: Provably Communication-efficient Data-parallel SGD via Nonuniform Quantization

Ramezani-Kebrya, Ali • Faghri, Fartash • Markov, Ilya • Aksenov, Vitalii • Alistarh, Dan • Roy, Daniel M.
January 1, 2021
Journal of Machine Learning Research

As the size and complexity of models and datasets grow, so does the need for communication-efficient variants of stochastic gradient descent that can be deployed to perform parallel model training. One popular communication-compression method for data-parallel SGD is QSGD (Alistarh et al., 2017), which quantizes and encodes gradients to reduce communication costs. The baseline variant of QSGD provides strong theoretical guarantees; for practical purposes, however, the authors proposed a heuristic variant, which we call QSGDinf, that demonstrated impressive empirical gains for distributed training of large neural networks. In this paper, we build on this work to propose a new gradient quantization scheme and show that it has stronger theoretical guarantees than QSGD while matching or exceeding the empirical performance of the QSGDinf heuristic and of other compression methods.
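The core idea the abstract describes — quantizing a gradient onto a small set of nonuniformly spaced levels with unbiased stochastic rounding — can be sketched as follows. This is an illustrative simplification, not the paper's exact scheme: the level placement (exponentially spaced levels 0, 2⁻ˢ, …, 1), the function name, and the omission of the variable-length encoding step are assumptions made for the sake of a short example.

```python
import numpy as np

def nonuniform_quantize(v, s=3, rng=None):
    """Stochastically quantize v onto exponentially spaced levels.

    Illustrative sketch in the spirit of nonuniform gradient
    quantization; level placement and encoding are simplified.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    # Nonuniform levels: 0, 2^-s, 2^-(s-1), ..., 1 (denser near zero,
    # matching the typical distribution of normalized gradient entries)
    levels = np.concatenate(([0.0], 2.0 ** np.arange(-s, 1)))
    r = np.abs(v) / norm                          # normalized magnitudes in [0, 1]
    # Index of the level at or below each r_i
    lo = np.searchsorted(levels, r, side="right") - 1
    lo = np.clip(lo, 0, len(levels) - 2)
    l_lo, l_hi = levels[lo], levels[lo + 1]
    # Unbiased stochastic rounding between the two bracketing levels:
    # E[q_i] = l_lo + p_i * (l_hi - l_lo) = r_i
    p = (r - l_lo) / (l_hi - l_lo)
    q = np.where(rng.random(v.shape) < p, l_hi, l_lo)
    return norm * np.sign(v) * q
```

Because each coordinate lands on one of only s + 2 magnitudes plus a sign, it can be transmitted with a few bits instead of a full float, which is the source of the communication savings; the stochastic rounding keeps the quantized gradient an unbiased estimate of the true one.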

Details
Type
research article
Web of Science ID
WOS:000663169100001
Author(s)
Ramezani-Kebrya, Ali
Faghri, Fartash
Markov, Ilya
Aksenov, Vitalii
Alistarh, Dan  
Roy, Daniel M.
Date Issued
2021-01-01
Publisher
Microtome Publishing
Published in
Journal of Machine Learning Research
Volume
22
Start page
114

Subjects
Automation & Control Systems • Computer Science, Artificial Intelligence • Computer Science • communication-efficient sgd • quantization • gradient compression • data-parallel sgd • deep learning

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
DCL
Available on Infoscience
July 17, 2021
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/180042
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.