Optimal Rates for Spectral Algorithms with Least-Squares Regression over Hilbert Spaces
In this paper, we study regression problems over a separable Hilbert space with the square loss, a setting that covers non-parametric regression over a reproducing kernel Hilbert space. We investigate a class of spectral/regularized algorithms, including ridge regression, principal component regression, and gradient methods. We prove optimal, high-probability convergence results in terms of a variety of norms for the studied algorithms, under a capacity assumption on the hypothesis space and a general source condition on the target function. As a consequence, we obtain almost-sure convergence results with optimal rates. Our results improve on and generalize previous results, filling a theoretical gap for the non-attainable cases.
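To make the family of algorithms concrete, the following is a minimal sketch of spectral regularization over an RKHS: each method applies a scalar filter to the eigenvalues of the empirical kernel matrix, and ridge regression and principal component regression differ only in the choice of filter. All names, the Gaussian kernel, and the filter scalings below are illustrative assumptions for exposition, not the paper's notation or method.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Z (assumed kernel choice)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def spectral_estimator(K, y, filt):
    """Coefficients alpha = V g(S) V^T y, where K = V diag(S) V^T.

    The filter g determines the algorithm: different choices of g
    recover ridge regression, PCR, gradient methods, etc.
    """
    S, V = np.linalg.eigh(K)                     # K is symmetric PSD
    return V @ (filt(np.clip(S, 0, None)) * (V.T @ y))

n = 200
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)

K = gaussian_kernel(X, X)
lam = 1e-3  # regularization parameter (illustrative value)

# Ridge regression filter: g(s) = 1 / (s + n*lam)
alpha_ridge = spectral_estimator(K, y, lambda s: 1.0 / (s + n * lam))

# Principal component regression filter: g(s) = 1/s on the top of the
# spectrum (s >= n*lam), and 0 on the discarded components
alpha_pcr = spectral_estimator(
    K, y, lambda s: np.where(s >= n * lam, 1.0 / np.maximum(s, 1e-12), 0.0)
)

# Prediction at new points: f(x) = sum_i alpha_i k(x_i, x)
X_test = np.linspace(-1, 1, 5)[:, None]
print(gaussian_kernel(X_test, X) @ alpha_ridge)
```

In this view, gradient methods also fit the same template: running t steps of gradient descent on the empirical least-squares objective corresponds to a polynomial filter in the kernel eigenvalues, with the iteration count playing the role of the regularization parameter.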