conference paper

Diffusion gradient boosting for networked learning

Ying, Bicheng • Sayed, Ali H.
2017
Proceedings ICASSP
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Using duality arguments from optimization theory, this work develops an effective distributed gradient boosting strategy for inference and classification by networked clusters of learners. By sharing local dual variables with their immediate neighbors through a diffusion learning protocol, the clusters are able to match the performance of centralized boosting solutions even when the individual clusters only have access to partial information about the feature space.
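The abstract describes a dual-domain diffusion strategy in which each cluster exchanges its local dual variables with immediate neighbors. As a rough illustration only (this is not the paper's algorithm; the function name, step size, and data layout are hypothetical), one adapt-then-combine diffusion iteration on the dual variables could be sketched in Python as follows:

import numpy as np

def diffusion_dual_step(duals, neighbors, weights, local_grads, mu=0.1):
    """One hypothetical adapt-then-combine iteration on local dual variables.

    duals       : dict mapping agent -> current dual vector (np.ndarray)
    neighbors   : dict mapping agent -> list of neighbors (including itself)
    weights     : dict mapping (agent, neighbor) -> combination weight
    local_grads : dict mapping agent -> gradient of that agent's local dual objective
    mu          : step size (illustrative value, not taken from the paper)
    """
    # Adapt: each agent takes a gradient step on its own dual objective.
    intermediate = {k: duals[k] + mu * local_grads[k] for k in duals}
    # Combine: each agent averages the intermediate iterates over its neighborhood,
    # which is the "sharing of local dual variables" the abstract refers to.
    return {
        k: sum(weights[(k, l)] * intermediate[l] for l in neighbors[k])
        for k in duals
    }

In diffusion strategies of this kind, the combination weights are typically chosen nonnegative and summing to one over each neighborhood (for example, uniform or Metropolis weights); the specific weights, dual objective, and boosting steps used here are given in the ICASSP paper itself.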

Details
Type
conference paper
DOI
10.1109/ICASSP.2017.7952609
Author(s)
Ying, Bicheng
Sayed, Ali H.  
Date Issued
2017
Published in
Proceedings ICASSP
Start page
2512
End page
2516
Editorial or Peer reviewed
REVIEWED
Written at
OTHER
EPFL units
ASL
Event name
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Event date
New Orleans, LA, 2017

Available on Infoscience
December 19, 2017
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/143445

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.