Title: Diffusion gradient boosting for networked learning
Authors: Ying, Bicheng; Sayed, Ali H.
Date: 2017-12-19 (published 2017)
DOI: 10.1109/ICASSP.2017.7952609
Handle: https://infoscience.epfl.ch/handle/20.500.14299/143445
Type: text::conference output::conference proceedings::conference paper

Abstract: Using duality arguments from optimization theory, this work develops an effective distributed gradient boosting strategy for inference and classification by networked clusters of learners. By sharing local dual variables with their immediate neighbors through a diffusion learning protocol, the clusters are able to match the performance of centralized boosting solutions even when the individual clusters only have access to partial information about the feature space.
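
Note: the abstract above describes clusters exchanging local dual variables with their immediate neighbors via a diffusion protocol. The following is only a minimal illustrative sketch of a generic adapt-then-combine diffusion step over dual variables, not the paper's algorithm; the toy quadratic objective, the combination matrix A, and all names (diffusion_dual_step, duals, grads, mu) are assumptions made for this example.

    # Minimal sketch of an adapt-then-combine diffusion iteration (assumed
    # setup, not the paper's exact method).
    import numpy as np

    def diffusion_dual_step(duals, grads, A, mu=0.1):
        """One adapt-then-combine iteration.

        duals : (K, d) array, one local dual variable per cluster
        grads : (K, d) array, local dual gradients evaluated at `duals`
        A     : (K, K) doubly stochastic combination matrix; A[l, k] > 0 only
                when clusters l and k are neighbors
        mu    : step size
        """
        # Adapt: each cluster takes a local gradient step on its own dual.
        psi = duals - mu * grads
        # Combine: each cluster averages the intermediate iterates of its neighbors.
        return A.T @ psi

    # Toy usage over a fully connected 3-cluster network.
    K, d = 3, 2
    rng = np.random.default_rng(0)
    A = np.array([[0.5, 0.25, 0.25],
                  [0.25, 0.5, 0.25],
                  [0.25, 0.25, 0.5]])       # symmetric, doubly stochastic
    duals = rng.standard_normal((K, d))
    targets = rng.standard_normal((K, d))   # stand-in for local data
    for _ in range(50):
        grads = duals - targets             # gradient of a toy quadratic dual cost
        duals = diffusion_dual_step(duals, grads, A)
    print(duals)                            # clusters agree near the mean of targets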