Abstract

We address the problem of learning a classifier from data distributed over a number of arbitrarily connected machines without exchanging the data points themselves. Our goal is to train a neural network at each machine as if the entire dataset were locally available. This is accomplished by exploiting the so-called consensus algorithm for scalar values distributed over a network. We describe an abstract framework for consensus learning and derive a distributed version of the multilayer feed-forward neural network with back-propagation and early stopping. Experiments show that, with careful parameter selection, our method performs like its non-distributed counterpart; the trade-off is a larger total computational effort across all machines.
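
For context, the consensus algorithm referred to above is commonly the standard average-consensus iteration, in which each machine repeatedly nudges its local scalar toward the values held by its network neighbors until all machines agree on the global average. The following is a minimal sketch under that assumption; the ring topology, step size, and iteration count are illustrative choices, not parameters taken from the paper.

    import numpy as np

    def average_consensus(values, neighbors, epsilon=0.2, iterations=100):
        """Drive each machine's scalar toward the network-wide average.

        values    -- initial scalar held by each machine (one per node)
        neighbors -- adjacency list: neighbors[i] lists the nodes linked to i
        epsilon   -- step size; 0 < epsilon < 1/max_degree ensures convergence
        """
        x = np.asarray(values, dtype=float)
        for _ in range(iterations):
            # Each node moves toward the mean of its neighbors' current values.
            x = x + epsilon * np.array(
                [sum(x[j] - x[i] for j in neighbors[i]) for i in range(len(x))]
            )
        return x

    # Example: 4 machines on a ring; all nodes converge to the global mean 2.5
    # using only local exchanges, never sharing the underlying data points.
    ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    print(average_consensus([1.0, 2.0, 3.0, 4.0], ring))

In the distributed learning setting the paper describes, the same agreement mechanism would be applied to quantities arising during training rather than to raw data, which is what allows each machine to behave as if it saw the whole dataset.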
