Infoscience

 
conference paper

Accelerating Gradient Boosting Machines

Lu, Haihao • Karimireddy, Sai Praneeth • Ponomareva, Natalia • Mirrokni, Vahab
January 1, 2020
International Conference On Artificial Intelligence And Statistics, Vol 108
23rd International Conference on Artificial Intelligence and Statistics (AISTATS)

Gradient Boosting Machine (GBM), introduced by Friedman (2001), is a widely used ensemble technique and is routinely employed in competitions such as Kaggle and the KDD-Cup (Chen and Guestrin, 2016). In this work, we propose an Accelerated Gradient Boosting Machine (AGBM) by incorporating Nesterov's acceleration techniques into the design of GBM. The difficulty in accelerating GBM lies in the fact that weak (inexact) learners are commonly used, so a naive application of acceleration lets errors accumulate in the momentum term. To overcome this, we design a "corrected pseudo residual" that serves as a new target for fitting a weak learner, in order to perform the z-update. We are thus able to derive novel computational guarantees for AGBM. This is the first GBM-type algorithm with a theoretically justified accelerated convergence rate.
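To make the two-sequence structure described in the abstract concrete, the following is a minimal, illustrative sketch in Python for squared-error loss, using shallow scikit-learn regression trees as the weak learners. The function name agbm_fit, the hyperparameter names eta and gamma, and the exact momentum weights are simplifications for illustration only; consult the paper for the precise algorithm and its computational guarantees.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def agbm_fit(X, y, n_rounds=100, eta=0.1, gamma=1.0, max_depth=3):
    """Illustrative accelerated boosting loop for squared-error loss (sketch).

    Maintains a primary model f, a momentum model h, and their combination g.
    Each round fits one weak learner to the usual pseudo residual (f-update)
    and a second one to the "corrected pseudo residual" (h-update, the
    z-update mentioned in the abstract).
    """
    n = X.shape[0]
    f = np.zeros(n)   # training-set predictions of the primary model
    h = np.zeros(n)   # training-set predictions of the momentum model
    c = np.zeros(n)   # corrected pseudo residual, carried across rounds
    learners_f, learners_h = [], []

    for m in range(n_rounds):
        theta = 2.0 / (m + 2.0)
        g = (1.0 - theta) * f + theta * h          # combined model

        # Pseudo residual: negative gradient of 0.5 * (y - g)^2 w.r.t. g.
        r = y - g

        # First weak learner fits the pseudo residual (gradient step on f).
        tree_f = DecisionTreeRegressor(max_depth=max_depth).fit(X, r)
        f = g + eta * tree_f.predict(X)

        # Corrected pseudo residual: current residual plus a momentum-style
        # correction based on what the previous h-learner failed to fit.
        if m == 0:
            c = r
        else:
            c = r + (m + 1.0) / (m + 2.0) * (c - learners_h[-1].predict(X))

        # Second weak learner fits the corrected residual (momentum step on h).
        tree_h = DecisionTreeRegressor(max_depth=max_depth).fit(X, c)
        h = h + (gamma * eta / theta) * tree_h.predict(X)

        learners_f.append(tree_f)
        learners_h.append(tree_h)

    return learners_f, learners_h
```

The design point visible in this sketch is that the momentum sequence h is never fit to the raw pseudo residual directly; it targets the corrected residual instead, which is how the error accumulation in the momentum term described above is kept under control.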

Type
conference paper
Web of Science ID
WOS:000559931302019
Author(s)
Lu, Haihao
Karimireddy, Sai Praneeth  
Ponomareva, Natalia
Mirrokni, Vahab
Date Issued
2020-01-01
Publisher
ADDISON-WESLEY PUBL CO
Publisher place
Boston
Published in
International Conference On Artificial Intelligence And Statistics, Vol 108
Series title/Series vol.
Proceedings of Machine Learning Research
Volume
108
Start page
516
End page
525
Subjects
Computer Science, Artificial Intelligence • Statistics & Probability • Computer Science • Mathematics • cubic regularization • convergence • algorithms • regression
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
MLO  
Event name
23rd International Conference on Artificial Intelligence and Statistics (AISTATS)
Event place
ELECTR NETWORK
Event date
Aug 26-28, 2020
Available on Infoscience
October 25, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/172732