Kernel Conjugate Gradient Methods with Random Projections
We propose and study kernel conjugate gradient methods (KCGM) with random projections for least-squares regression over a separable Hilbert space. Considering two types of random projections, generated by randomized sketches and by Nyström subsampling, we prove optimal statistical results, measured in variants of norms, for the algorithms under a suitable stopping rule. In particular, our results show that if the projection dimension is proportional to the effective dimension of the problem, KCGM with randomized sketches generalizes optimally while achieving a computational advantage. As a corollary, we derive optimal rates for classic KCGM in well-conditioned regimes, even when the target function may not lie in the hypothesis space.
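To make the scheme concrete, here is a minimal numpy sketch of the Nyström variant: kernel least-squares solved by plain conjugate gradient inside an m-dimensional Nyström subspace, with early stopping acting as the regularizer. The RBF kernel, uniform landmark sampling, fixed iteration count, and toy data are illustrative assumptions; the paper's specific stopping rule and norm analysis are not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared distances
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kcgm_nystrom(X, y, m, gamma=1.0, n_iters=10, seed=0):
    """Kernel CG for least squares, restricted to an m-dimensional
    Nystrom subspace; early stopping (n_iters) regularizes.
    (Illustrative sketch, not the paper's exact algorithm.)"""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # Nystrom landmarks
    Xm = X[idx]
    Knm = rbf_kernel(X, Xm, gamma)               # n x m cross-kernel
    # Normal equations of the projected least-squares problem:
    #   min_beta ||Knm beta - y||^2  ->  (Knm^T Knm) beta = Knm^T y
    A = Knm.T @ Knm / n
    b = Knm.T @ y / n
    # Plain conjugate gradient on the m x m system, stopped early
    beta = np.zeros(m)
    r = b - A @ beta
    p = r.copy()
    rs = r @ r
    for _ in range(n_iters):
        Ap = A @ p
        step = rs / (p @ Ap)
        beta += step * p
        r -= step * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    def predict(Xtest):
        return rbf_kernel(Xtest, Xm, gamma) @ beta
    return predict

# Toy usage: regression on noisy sine data
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
predict = kcgm_nystrom(X, y, m=50, gamma=0.5, n_iters=8)
print(np.mean((predict(X) - y) ** 2))  # training MSE
```

Note that each CG iteration here costs O(m^2) rather than the O(n^2) of classic KCGM on the full kernel matrix, which is the computational advantage the abstract refers to when m tracks the effective dimension.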