Title: COLA: Decentralized Linear Learning
Authors: He, Lie; Bian, An; Jaggi, Martin
Date issued: 2018-01-01
Record date: 2019-06-18
Handle: https://infoscience.epfl.ch/handle/20.500.14299/157234
Web of Science ID: WOS:000461823304054
Type: text::conference output::conference proceedings::conference paper
Subjects: Computer Science, Artificial Intelligence; Computer Science
Keywords: optimization; convergence; algorithm

Abstract: Decentralized machine learning is a promising emerging paradigm in view of global challenges of data ownership and privacy. We consider learning of linear classification and regression models in the setting where the training data is decentralized over many user devices, and the learning algorithm must run on-device, over an arbitrary communication network, without a central coordinator. We propose COLA, a new decentralized training algorithm with strong theoretical guarantees and superior practical performance. Our framework overcomes many limitations of existing methods: it achieves communication efficiency, scalability, elasticity, and resilience to changes in data, while allowing for unreliable and heterogeneous participating devices.
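To illustrate the decentralized setting described in the abstract (local data shards, on-device updates, peer-to-peer communication, no central coordinator), the sketch below runs a simple gossip-based decentralized gradient method for ridge regression over a simulated ring network. This is not the COLA algorithm itself; the topology, step size, and all variable names are assumptions chosen only for demonstration.

```python
# Illustrative sketch only (not COLA): decentralized ridge regression where
# each simulated device holds its own data shard and repeatedly (1) takes a
# local gradient step and (2) averages its model with its ring neighbors.
# num_nodes, the ring topology, and the step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
num_nodes, n_per_node, dim = 8, 50, 10
lam, step, rounds = 0.1, 0.05, 200

# Synthetic local datasets: node k only ever touches (X[k], y[k]).
w_true = rng.normal(size=dim)
X = [rng.normal(size=(n_per_node, dim)) for _ in range(num_nodes)]
y = [Xk @ w_true + 0.1 * rng.normal(size=n_per_node) for Xk in X]

# Ring communication graph: each node mixes with its two neighbors.
W = np.zeros((num_nodes, num_nodes))
for k in range(num_nodes):
    W[k, k] = W[k, (k - 1) % num_nodes] = W[k, (k + 1) % num_nodes] = 1 / 3

w = np.zeros((num_nodes, dim))  # one local model copy per device
for _ in range(rounds):
    # Local step: gradient of each node's own ridge-regression objective.
    grads = np.stack([
        X[k].T @ (X[k] @ w[k] - y[k]) / n_per_node + lam * w[k]
        for k in range(num_nodes)
    ])
    # Communication step: gossip-average parameters with neighbors only.
    w = W @ (w - step * grads)

print("disagreement across nodes:", np.max(np.std(w, axis=0)))
print("error vs. true model:", np.linalg.norm(w.mean(axis=0) - w_true))
```

The point of the sketch is the structure of the computation, not the specific update rule: every round consists of purely local work followed by exchanges restricted to graph neighbors, which is the communication pattern the paper's setting requires.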