Revisiting general source condition in learning over a Hilbert space (2503.20495v1)

Published 26 Mar 2025 in math.ST and stat.TH

Abstract: In learning theory, the smoothness assumption on the target function (known as the source condition) is a key factor in establishing theoretical convergence rates for an estimator. The existing general form of the source condition, as discussed in the learning theory literature, has traditionally been restricted to a class of functions that can be expressed as a product of an operator monotone function and a Lipschitz continuous function. In this note, we remove these restrictions on the index function and establish optimal convergence rates for least-squares regression over a Hilbert space with general regularization under a general source condition, thereby significantly broadening the scope of existing theoretical results.
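
As a brief illustration of the setting the abstract describes (the notation below is standard in this literature but assumed here, not taken from the paper): a general source condition posits that the target function lies in the range of an "index function" of a self-adjoint operator associated with the learning problem, such as the integral (covariance) operator.

```latex
% Source condition with a general index function \phi (notation assumed).
% T : \mathcal{H} \to \mathcal{H} is the (compact, positive, self-adjoint)
% integral operator of the kernel; f_\rho is the target function.
f_\rho = \phi(T)\, v, \qquad \|v\|_{\mathcal{H}} \le R,
% Classical special case: the H\"older source condition
\phi(t) = t^{r}, \qquad r > 0.
% Prior general results required a factorization of the form
\phi = \varphi \cdot \psi, \quad \varphi \text{ operator monotone}, \
\psi \text{ Lipschitz continuous},
% whereas this note establishes optimal rates without that factorization.
```

Larger exponents $r$ (or faster-growing $\phi$) encode a smoother target and yield faster convergence rates; the contribution highlighted in the abstract is that the rates hold for index functions $\phi$ beyond the operator-monotone-times-Lipschitz class.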

