Optimal Learning Rates for Regularized Least-Squares with a Fourier Capacity Condition (2204.07856v4)

Published 16 Apr 2022 in math.ST, math.FA, stat.ML, and stat.TH

Abstract: We derive minimax adaptive rates for a new, broad class of Tikhonov-regularized learning problems in Hilbert scales under general source conditions. Our analysis does not require the regression function to be contained in the hypothesis class, and most notably does not employ the conventional a priori assumptions on kernel eigendecay. Using the theory of interpolation, we demonstrate that the spectrum of the Mercer operator can be inferred in the presence of "tight" $L^{\infty}(\mathcal{X})$ embeddings of suitable Hilbert scales. Our analysis utilizes a new Fourier isocapacitary condition, which captures the interplay of the kernel Dirichlet capacities and small ball probabilities via the optimal Hilbert scale function.
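For orientation, the Tikhonov-regularized least-squares estimator referred to in the abstract is conventionally defined as the minimizer of an empirical squared loss plus a squared-norm penalty. The display below is a standard sketch of that objective; the notation (sample $(x_i, y_i)_{i=1}^{n}$, hypothesis space $\mathcal{H}$, regularization parameter $\lambda$) is generic and not taken from the paper, which works more generally in Hilbert scales under general source conditions and does not assume the regression function lies in $\mathcal{H}$.

% Standard Tikhonov-regularized (kernel ridge) least-squares objective;
% all symbols here are illustrative, not the paper's notation.
$$
f_\lambda \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}} \;
\frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - f(x_i) \bigr)^2
\;+\; \lambda \, \lVert f \rVert_{\mathcal{H}}^2
$$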

Citations (2)
