Infoscience
EPFL, École polytechnique fédérale de Lausanne
Research article

Deterministic error bounds for kernel-based learning techniques under bounded noise

Maddalena, Emilio Tanowe • Scharnhorst, Paul • Jones, Colin N.
December 1, 2021
Automatica

We consider the problem of reconstructing a function from a finite set of noise-corrupted samples. Two kernel algorithms are analyzed, namely kernel ridge regression and ε-support vector regression. By assuming that the ground-truth function belongs to the reproducing kernel Hilbert space of the chosen kernel and that the measurement noise affecting the dataset is bounded, we adopt an approximation-theory viewpoint to establish deterministic, finite-sample error bounds for the two models. Finally, we discuss their connection with Gaussian processes and provide two numerical examples. In establishing our inequalities, we hope to help bring the fields of non-parametric kernel learning and system identification for robust control closer to each other. © 2021 Elsevier Ltd. All rights reserved.
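As a minimal illustration of the first of the two models the abstract names, the following sketch fits kernel ridge regression to samples of a known function corrupted by bounded noise. The Gaussian kernel, lengthscale, regularization weight, and noise level are our own illustrative choices, not parameters taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, lengthscale=0.2):
    """Gaussian (RBF) kernel matrix between row-wise sample sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * lengthscale ** 2))

def krr_fit(X, y, lam=1e-2, lengthscale=0.2):
    """Kernel ridge regression: solve (K + lam * I) alpha = y for the weights."""
    K = gaussian_kernel(X, X, lengthscale)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(Xq, X, alpha, lengthscale=0.2):
    """Evaluate the fitted model at query points Xq."""
    return gaussian_kernel(Xq, X, lengthscale) @ alpha

# Noisy samples of f(x) = sin(2*pi*x); the noise is uniformly
# bounded by 0.05, matching the paper's bounded-noise setting.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.uniform(-0.05, 0.05, size=30)

alpha = krr_fit(X, y)
Xq = np.linspace(0.0, 1.0, 200)[:, None]
pred = krr_predict(Xq, X, alpha)
```

The deterministic bounds of the paper then quantify how far `pred` can deviate from the ground truth, given the noise bound and the RKHS norm of the target function; this snippet only produces the point estimate itself.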

File details:
  • Name: 2008.04005.pdf
  • Type: Preprint
  • Version: Submitted version (Preprint)
  • Access type: Open access
  • License condition: n/a
  • Size: 1.74 MB
  • Format: Adobe PDF
  • Checksum (MD5): 43f6b21697fa3d5ba2c6358275c797c3

Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.