Abstract

In major distributed optimization systems, the main bottleneck is often the communication between machines. To reduce the time spent on communication, heuristics that lower the precision of the messages sent have been developed and shown to produce good results in practice, and [Alistarh et al., 2017] introduced the quantization framework to analyze theoretically the effect of lossy compression on the convergence rate of gradient descent algorithms. This work identifies an issue in one of the proofs in [Alistarh et al., 2017] and provides a new approach to reduce the error introduced by low-precision updates.
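
For concreteness, the quantization framework of [Alistarh et al., 2017] compresses a gradient vector by stochastically rounding each coordinate to one of a few discrete levels, so that the compressed message is unbiased. Below is a minimal sketch of this kind of stochastic quantizer, assuming NumPy; the function name and the parameter `s` (number of levels) are chosen here for illustration, not taken from the paper.

```python
import numpy as np

def stochastic_quantize(v: np.ndarray, s: int, rng: np.random.Generator) -> np.ndarray:
    """Quantize v to s levels per coordinate, unbiasedly: E[Q(v)] = v."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    # Scaled magnitudes in [0, s]; each falls between two integer levels.
    scaled = s * np.abs(v) / norm
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part
    # (stochastic rounding), which makes the quantizer unbiased.
    level = lower + (rng.random(v.shape) < (scaled - lower))
    return norm * np.sign(v) * level / s

# Usage: averaging many quantized copies recovers the original vector,
# illustrating the unbiasedness that the convergence analysis relies on.
rng = np.random.default_rng(0)
g = rng.standard_normal(8)
est = np.mean([stochastic_quantize(g, s=4, rng=rng) for _ in range(10_000)], axis=0)
print(np.allclose(est, g, atol=0.05))
```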
