Optimal Rates for Spectral Algorithms with Least-Squares Regression over Hilbert Spaces

In this paper, we study regression problems over a separable Hilbert space with the square loss, covering non-parametric regression over a reproducing kernel Hilbert space. We investigate a class of spectral/regularized algorithms, including ridge regression, principal component regression, and gradient methods. We prove optimal, high-probability convergence results in terms of variants of norms for the studied algorithms, considering a capacity assumption on the hypothesis space and a general source condition on the target function. Consequently, we obtain almost sure convergence results with optimal rates. Our results improve and generalize previous results, filling a theoretical gap for the non-attainable cases.
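The spectral/regularized algorithms named in the abstract share a common structure: each applies a filter function to the eigenvalues of the empirical kernel (or covariance) operator in place of a plain inverse. The following is a minimal illustrative sketch, not the paper's implementation: it assumes a Gaussian kernel on toy 1-D data and shows how the ridge (Tikhonov) and spectral cut-off (principal component regression) filters fit into one template.

```python
import numpy as np

def spectral_fit(K, y, filter_fn):
    """Fit f(x) = sum_i alpha_i k(x_i, x) by applying a spectral filter
    g_lambda to the eigenvalues of the normalized kernel matrix K/n."""
    n = len(y)
    eigvals, eigvecs = np.linalg.eigh(K / n)          # K is symmetric PSD
    filtered = filter_fn(np.clip(eigvals, 0.0, None)) # guard tiny negatives
    # alpha = (1/n) * V g_lambda(Sigma) V^T y
    return eigvecs @ (filtered * (eigvecs.T @ y)) / n

lam = 0.1
# Ridge regression: g_lambda(sigma) = 1 / (sigma + lambda)
ridge = lambda s: 1.0 / (s + lam)
# Spectral cut-off (PCR): invert eigenvalues above lambda, discard the rest
pcr = lambda s: np.where(s >= lam, 1.0 / np.maximum(s, lam), 0.0)

# Toy data (assumed for illustration): noisy sine, Gaussian kernel
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=40)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=40)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.5)

alpha = spectral_fit(K, y, ridge)
pred = K @ alpha
```

With the ridge filter this template reproduces standard kernel ridge regression, i.e. `alpha = (K + n*lam*I)^{-1} y`; gradient methods correspond to polynomial filters in the same template.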


Published in:
Applied and Computational Harmonic Analysis
Date:
Oct 04 2018


Note: The file is under embargo until: 2020-10-04


 Record created 2018-12-27, last modified 2020-04-20

Fulltext:
Download PDF
Postprint:
Download PDF