Fully Quantized Distributed Gradient Descent

In major distributed optimization systems, the main bottleneck is often the communication between machines. To reduce the time spent on communication, heuristics that lower the precision of the messages sent have been developed and shown to perform well in practice, and [Alistarh et al., 2017] introduced the quantization framework to analyze theoretically the effect of lossy compression on the convergence rate of gradient descent algorithms. This work identifies an issue in one of the proofs in [Alistarh et al., 2017] and provides a new approach to reduce the error introduced by low-precision updates.
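To make the quantization idea concrete, here is a minimal sketch of a QSGD-style unbiased stochastic quantizer in the spirit of [Alistarh et al., 2017]: each gradient coordinate is stochastically rounded to one of `s` levels scaled by the vector norm, so that the quantized vector equals the original in expectation. The function name and the choice of `s` are illustrative, not taken from the thesis.

```python
import math
import random

def quantize(v, s=4):
    """QSGD-style stochastic quantization of vector v to s levels.

    Each coordinate x is mapped to sign(x) * ||v|| * (l / s), where the
    level l is rounded up or down at random so that E[quantize(v)] = v
    (an unbiased, lossy compression of the gradient).
    """
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0.0:
        return [0.0] * len(v)
    out = []
    for x in v:
        r = abs(x) / norm * s        # exact position in [0, s]
        l = math.floor(r)
        if random.random() < r - l:  # round up with probability r - l
            l += 1
        out.append(math.copysign(norm * l / s, x))
    return out
```

Only the discrete level `l` (an integer in `{0, ..., s}`), the sign, and the single scalar `norm` need to be transmitted per vector, which is what reduces communication cost; the stochastic rounding is what keeps the estimator unbiased despite the low precision.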


Advisor(s):
Jaggi, Martin
Stich, Sebastian Urban
Year:
2017




 Record created 2018-01-26, last modified 2018-03-17

