Generalization bounds for nonparametric regression with $\beta$-mixing samples (2108.00997v1)

Published 2 Aug 2021 in math.ST, stat.ML, and stat.TH

Abstract: In this paper we present a series of results that permit a direct extension of uniform deviation inequalities for the empirical process from the independent to the dependent case, characterizing the additional error in terms of the $\beta$-mixing coefficients associated with the training sample. We then apply these results to previously obtained inequalities for independent samples on the deviation of the least-squares error in nonparametric regression, deriving corresponding generalization bounds for regression schemes in which the training sample may not be independent. These results provide a framework for analyzing the error of regression schemes whose training sample comes from a large class of $\beta$-mixing sequences, including geometrically ergodic Markov samples, using only the independent case. More generally, they permit a meaningful extension of the Vapnik-Chervonenkis and similar theories for independent training samples to this class of $\beta$-mixing samples.
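
To make the mechanism concrete, the following sketch shows the classical blocking-and-coupling shape of such a transfer (in the spirit of Yu, 1994); the block length $a_n$, block count $\mu_n$, and the constants are illustrative assumptions, not the paper's exact statement. For a stationary sequence $(X_t)$, the $\beta$-mixing coefficients are

$\beta(k) = \mathbb{E}\big[\sup_{A \in \sigma(X_{t+k}, X_{t+k+1}, \dots)} |\mathbb{P}(A \mid \sigma(X_1,\dots,X_t)) - \mathbb{P}(A)|\big].$

Splitting a sample of size $n = 2 a_n \mu_n$ into alternating blocks of length $a_n$ and using a Berbee-type coupling to replace every other block by an independent copy yields, for a function class $\mathcal{F}$,

$\mathbb{P}\Big(\sup_{f \in \mathcal{F}} \big|\tfrac{1}{n}\sum_{t=1}^{n} f(X_t) - \mathbb{E} f(X_1)\big| > \varepsilon\Big) \le 2\,\mathbb{P}^{\mathrm{ind}}\Big(\sup_{f \in \mathcal{F}} \big|\tfrac{1}{\mu_n}\sum_{j=1}^{\mu_n} \bar{f}(B_j) - \mathbb{E} f(X_1)\big| > \varepsilon\Big) + 2\,\mu_n \beta(a_n),$

where $\bar{f}(B_j) = \tfrac{1}{a_n}\sum_{t \in B_j} f(X_t)$ and $\mathbb{P}^{\mathrm{ind}}$ is computed with the blocks made independent. The dependent-case deviation is thus the independent-case bound plus an additive $\mu_n \beta(a_n)$ penalty; for geometrically ergodic Markov samples, $\beta(k)$ decays exponentially, so blocks of length $a_n \asymp \log n$ already make the penalty negligible.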
