Convergence rates of non-stationary and deep Gaussian process regression (2312.07320v4)

Published 12 Dec 2023 in math.ST, stat.ME, and stat.TH

Abstract: The focus of this work is the convergence of non-stationary and deep Gaussian process regression. More precisely, we follow a Bayesian approach to regression or interpolation, where the prior placed on the unknown function $f$ is a non-stationary or deep Gaussian process, and we derive convergence rates of the posterior mean to the true function $f$ in terms of the number of observed training points. In some cases, we also show convergence of the posterior variance to zero. The only assumption imposed on the function $f$ is that it is an element of a certain reproducing kernel Hilbert space, which we show, in particular cases, to be norm-equivalent to a Sobolev space. Our analysis includes the case of estimated hyper-parameters in the covariance kernels employed, both in an empirical Bayes setting and in the particular hierarchical setting constructed through deep Gaussian processes. We consider the settings of noise-free or noisy observations on deterministic or random training points. We establish general assumptions sufficient for the convergence of deep Gaussian process regression, along with explicit examples demonstrating the fulfilment of these assumptions. Specifically, our examples require that the Hölder or Sobolev norms of the penultimate layer are bounded almost surely.
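To make the objects of study concrete, here is a minimal sketch (not taken from the paper) of standard Gaussian process regression: it computes the posterior mean and posterior variance whose convergence the abstract describes. The Matérn-3/2 kernel, lengthscale, and noise level are illustrative assumptions; this kernel is a natural choice here because its reproducing kernel Hilbert space is norm-equivalent to a Sobolev space, matching the regularity assumption placed on $f$.

```python
# Minimal sketch of GP regression (illustrative assumptions, not the paper's setup).
import numpy as np

def matern_3_2(x, y, lengthscale=0.5):
    """Matern-3/2 kernel on 1D inputs; its RKHS is norm-equivalent to a
    Sobolev space, the kind of regularity assumption imposed on f."""
    r = np.abs(x[:, None] - y[None, :]) / lengthscale
    return (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def gp_posterior(x_train, y_train, x_test, noise_var=1e-2):
    """Posterior mean m(x) and variance s^2(x) for noisy observations
    y_i = f(x_i) + eps_i with eps_i ~ N(0, noise_var)."""
    K = matern_3_2(x_train, x_train) + noise_var * np.eye(len(x_train))
    k_star = matern_3_2(x_test, x_train)          # cross-covariances k(x*, x_i)
    alpha = np.linalg.solve(K, y_train)
    mean = k_star @ alpha
    v = np.linalg.solve(K, k_star.T)
    # Prior variance k(x*, x*) minus the reduction from the observations.
    var = matern_3_2(x_test, x_test).diagonal() - np.einsum("ij,ji->i", k_star, v)
    return mean, var

# As the number of training points n grows, the posterior mean approaches f
# and the posterior variance shrinks toward zero -- the behavior whose rates
# the paper quantifies.
rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x)
for n in (10, 40, 160):
    x_tr = rng.uniform(0.0, 1.0, size=n)
    y_tr = f(x_tr) + 0.1 * rng.standard_normal(n)
    x_te = np.linspace(0.0, 1.0, 200)
    mean, var = gp_posterior(x_tr, y_tr, x_te)
    print(n, np.max(np.abs(mean - f(x_te))), np.max(var))
```

Running the loop shows the sup-norm error of the posterior mean and the maximal posterior variance both decreasing as $n$ increases; the paper's contribution is to prove such rates when the prior is a non-stationary or deep Gaussian process, including with estimated hyper-parameters.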
