Diffusion gradient boosting for networked learning
Using duality arguments from optimization theory, this work develops an effective distributed gradient boosting strategy for inference and classification by networked clusters of learners. By sharing local dual variables with their immediate neighbors through a diffusion learning protocol, the clusters are able to match the performance of centralized boosting solutions even when the individual clusters only have access to partial information about the feature space.
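The abstract describes clusters that take a local step on their dual variables and then fuse the iterates of immediate neighbors through a diffusion protocol. The following is a minimal adapt-then-combine sketch of that idea; the ring topology, quadratic local duals, step size, and combination weights are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 4, 3              # number of clusters, dual-variable dimension
mu = 0.1                 # constant step size (assumed for illustration)

# Ring topology: each cluster exchanges only with its immediate neighbors.
# Columns sum to one and the matrix is symmetric, hence doubly stochastic.
A = np.zeros((K, K))
for k in range(K):
    for j in (k - 1, k, k + 1):
        A[j % K, k] = 1.0 / 3.0

# Toy stand-in for partial information: each cluster's local dual cost is
# 0.5 * ||lam - b_k||^2 with a different optimum b_k per cluster.
B = rng.standard_normal((K, d))
lam = np.zeros((K, d))   # row k holds cluster k's dual iterate

for _ in range(200):
    # Adapt: local dual step using only cluster k's own information.
    psi = lam - mu * (lam - B)
    # Combine: diffuse the intermediate iterates among immediate neighbors.
    lam = A.T @ psi

# Each row of lam now sits near the network-average optimum B.mean(axis=0),
# up to a small residual bias caused by the constant step size.
```

With a doubly stochastic combination matrix, the network average of the iterates follows the centralized dual-ascent recursion, which is one way to see why neighbor-only exchanges can approach centralized performance.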
Type
conference paper
Author(s)
Ying, Bicheng
Date Issued
2017
Published in
Proceedings ICASSP
Start page
2512
End page
2516
Editorial or Peer reviewed
REVIEWED
Written at
OTHER
| Event name | Event date |
| --- | --- |
| New Orleans, LA | 2017 |
Available on Infoscience
December 19, 2017