Fully Quantized Distributed Gradient Descent

In most distributed optimization systems, communication between machines is the main bottleneck. To reduce the time spent on communication, heuristics that lower the precision of the messages exchanged have been developed and shown to work well in practice, and [Alistarh et al., 2017] introduced a quantization framework to analyze theoretically the effect of lossy compression on the convergence rate of gradient descent algorithms. This work identifies an issue in one of the proofs of [Alistarh et al., 2017] and proposes a new approach to reduce the error introduced by low-precision updates.
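As an illustration of the kind of low-precision update the abstract refers to, the sketch below implements a stochastic, norm-scaled quantizer in the style of [Alistarh et al., 2017]; the function name quantize, the number of levels s, and the NumPy implementation are illustrative assumptions, not the thesis's own code.

    import numpy as np

    def quantize(v, s=4, rng=np.random.default_rng()):
        """Stochastic s-level quantization of a gradient vector (QSGD-style sketch).

        Each coordinate is mapped to one of s+1 levels {0, 1/s, ..., 1}, scaled by
        the vector norm and sign; rounding up or down is randomized so that the
        quantized vector is an unbiased estimate of v.
        """
        norm = np.linalg.norm(v)
        if norm == 0.0:
            return np.zeros_like(v)
        ratio = np.abs(v) / norm        # coordinate magnitudes in [0, 1]
        lower = np.floor(ratio * s)     # lower quantization level
        prob = ratio * s - lower        # probability of rounding up keeps E[Q(v)] = v
        level = lower + (rng.random(v.shape) < prob)
        return norm * np.sign(v) * level / s

    # Unbiasedness check: averaging many quantizations recovers the gradient.
    g = np.array([0.3, -1.2, 0.05, 2.0])
    est = np.mean([quantize(g) for _ in range(10000)], axis=0)
    print(g, est)

Only the levels and the norm need to be transmitted, which is what cuts the communication cost; the price is the added quantization noise whose effect on convergence the quantization framework analyzes.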


Advisor(s):
Jaggi, Martin
Stich, Sebastian Urban
Year
2017
Keywords:
Laboratories:




Record created on 2018-01-26, modified on 2018-09-13

