Optimal Rates for Spectral Algorithms with Least-Squares Regression over Hilbert Spaces (1801.06720v4)
Published 20 Jan 2018 in stat.ML, cs.LG, and math.FA
Abstract: In this paper, we study regression problems over a separable Hilbert space with the square loss, covering non-parametric regression over a reproducing kernel Hilbert space. We investigate a class of spectral/regularized algorithms, including ridge regression, principal component regression, and gradient methods. We prove optimal, high-probability convergence results in terms of variants of norms for the studied algorithms, considering a capacity assumption on the hypothesis space and a general source condition on the target function. Consequently, we obtain almost sure convergence results with optimal rates. Our results improve and generalize previous results, filling a theoretical gap for the non-attainable cases.
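For context, the spectral/regularized algorithms studied here are typically written as filter functions applied to an empirical covariance operator. The display below is a minimal sketch in the standard notation of the spectral-regularization literature; the symbols $\hat S$, $\hat T$, $g_\lambda$ follow common conventions and may differ from the paper's own notation.

$$
\hat f_\lambda = g_\lambda(\hat T)\,\hat S^{*}\mathbf{y}, \qquad \hat T = \hat S^{*}\hat S,
$$

where $\hat S$ is the sampling operator mapping a hypothesis $f$ to its values at the data points and $g_\lambda$ is a regularizing filter. Standard instances covered by this template:

- ridge regression: $g_\lambda(\sigma) = (\sigma + \lambda)^{-1}$;
- principal component regression (spectral cutoff): $g_\lambda(\sigma) = \sigma^{-1}\,\mathbf{1}\{\sigma \ge \lambda\}$;
- gradient descent with step size $\eta$ after $t$ iterations: $g_t(\sigma) = \eta \sum_{k=0}^{t-1} (1 - \eta\sigma)^{k} = \frac{1 - (1-\eta\sigma)^{t}}{\sigma}$.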
- Junhong Lin
- Alessandro Rudi
- Lorenzo Rosasco
- Volkan Cevher