research article

Scalable kernel logistic regression with Nyström approximation: Theoretical analysis and application to discrete choice modelling

Martín-Baos, José Ángel • García-Ródenas, Ricardo • Rodriguez-Benitez, Luis • Bierlaire, Michel
February 7, 2025
Neurocomputing

The application of kernel-based Machine Learning (ML) techniques to discrete choice modelling using large datasets often faces challenges due to memory requirements and the considerable number of parameters involved in these models. This complexity hampers the efficient training of large-scale models. This paper addresses these problems of scalability by introducing the Nyström approximation for Kernel Logistic Regression (KLR) on large datasets. The study begins by presenting a theoretical analysis in which: (i) the set of KLR solutions is characterised, (ii) an upper bound to the solution of KLR with Nyström approximation is provided, and finally (iii) a specialisation of the optimisation algorithms to Nyström KLR is described. After this, the Nyström KLR is computationally validated. Four landmark selection methods are tested, including basic uniform sampling, a k-means sampling strategy, and two non-uniform methods grounded in leverage scores. The performance of these strategies is evaluated using large-scale transport mode choice datasets and is compared with traditional methods such as Multinomial Logit (MNL) and contemporary ML techniques. The study also assesses the efficiency of various optimisation techniques for the proposed Nyström KLR model. The performance of gradient descent, Momentum, Adam, and L-BFGS-B optimisation methods is examined on these datasets. Among these strategies, the k-means Nyström KLR approach emerges as a successful solution for applying KLR to large datasets, particularly when combined with the L-BFGS-B and Adam optimisation methods. The results highlight the ability of this strategy to handle datasets exceeding 200,000 observations while maintaining robust performance.
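
By way of illustration only, the sketch below shows one way the ingredients named in the abstract (k-means landmark selection, Nyström kernel blocks, and an L-BFGS-B solver) can be combined, reduced to the simpler binary-logit case. The RBF kernel, the regularisation strength lam, the landmark count m and all function names are assumptions made for this sketch; it is not the authors' implementation and omits the multinomial (discrete choice) formulation studied in the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

def fit_nystrom_klr(X, y, m=100, gamma=0.1, lam=1e-3, seed=0):
    """Binary KLR restricted to the span of m k-means landmarks (illustrative sketch)."""
    # k-means landmark selection: cluster centres act as the landmark set Z.
    Z = KMeans(n_clusters=m, n_init=10, random_state=seed).fit(X).cluster_centers_

    # Nystroem kernel blocks: data-vs-landmarks and landmarks-vs-landmarks.
    K_nm = rbf_kernel(X, Z, gamma=gamma)   # shape (n, m)
    K_mm = rbf_kernel(Z, Z, gamma=gamma)   # shape (m, m)

    def objective(alpha):
        # Latent scores f = K_nm @ alpha; labels y are coded in {-1, +1}.
        f = K_nm @ alpha
        nll = np.logaddexp(0.0, -y * f).sum()        # logistic negative log-likelihood
        reg = 0.5 * lam * alpha @ (K_mm @ alpha)     # RKHS-norm penalty on the Nystroem expansion
        p = expit(-y * f)                            # sigma(-y_i * f_i)
        grad = -K_nm.T @ (y * p) + lam * (K_mm @ alpha)
        return nll + reg, grad

    res = minimize(objective, np.zeros(m), jac=True, method="L-BFGS-B")
    return Z, res.x

def predict_proba(X_new, Z, alpha, gamma=0.1):
    """P(y = +1 | x) for new observations, using the fitted landmarks and weights."""
    return expit(rbf_kernel(X_new, Z, gamma=gamma) @ alpha)

The point of restricting the expansion to m landmarks is that memory and per-iteration cost scale with the n-by-m block rather than the full n-by-n kernel matrix, which is what makes datasets of the size mentioned above (over 200,000 observations) tractable; extending the sketch to the discrete choice setting would replace the logistic loss with a softmax over alternatives.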

Type
research article
DOI
10.1016/j.neucom.2024.128975
Scopus ID
2-s2.0-85210298332

Author(s)
Martín-Baos, José Ángel

Universidad de Castilla-La Mancha

García-Ródenas, Ricardo

Universidad de Castilla-La Mancha

Rodriguez-Benitez, Luis

Universidad de Castilla-La Mancha

Bierlaire, Michel  

École Polytechnique Fédérale de Lausanne

Date Issued

2025-02-07

Published in
Neurocomputing
Volume

617

Article Number

128975

Subjects

Discrete choice models • Kernel logistic regression • Low-rank approximation • Nyström method • Random utility models • Reproducing kernel Hilbert spaces

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
TRANSP-OR  
Funder(s)

EPFL
University of Castilla-La Mancha
ERDF

Grant Number

2022-GRIN-34249

Available on Infoscience
January 25, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/244479