Adaptive posterior contraction rates for empirical Bayesian drift estimation of a diffusion (1909.12710v2)

Published 27 Sep 2019 in math.ST, stat.ME, and stat.TH

Abstract: Due to their conjugate posteriors, Gaussian process priors are attractive for estimating the drift of stochastic differential equations with continuous time observations. However, their performance strongly depends on the choice of the hyper-parameters. We employ the marginal maximum likelihood estimator to estimate the scaling and/or smoothness parameter(s) of the prior and show that the corresponding posterior has optimal rates of convergence. General theorems do not apply directly to this model as the usual test functions are with respect to a random Hellinger-type metric. We allow for continuous and discrete, one- and two-dimensional sets of hyper-parameters, where optimising over the two-dimensional set of smoothness and scaling hyper-parameters is shown to be beneficial in terms of the adaptive range.
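The abstract's core idea, selecting a Gaussian process prior's hyper-parameters by maximising the marginal likelihood before forming the posterior, can be illustrated in a much simpler setting than the paper's diffusion model. The sketch below is a hypothetical toy example (GP regression with an RBF kernel and a grid search over a single scaling parameter), not the paper's construction; the kernel, grid, and noise level are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, scale):
    # Squared-exponential kernel; `scale` stands in for the scaling
    # hyper-parameter that empirical Bayes tunes (illustrative choice).
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / scale ** 2)

def marginal_loglik(scale, x, obs, noise_var=0.1):
    # Log marginal likelihood of the data under a zero-mean GP prior
    # with the given scale and Gaussian observation noise.
    K = rbf_kernel(x, x, scale) + noise_var * np.eye(len(x))
    _, logdet = np.linalg.slogdet(K)
    alpha = np.linalg.solve(K, obs)
    return -0.5 * (obs @ alpha + logdet + len(x) * np.log(2 * np.pi))

# Toy data: noisy observations of a smooth function (not diffusion data).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
obs = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(50)

# Empirical Bayes step: pick the scale maximising the marginal
# likelihood over a grid, a crude analogue of the paper's MMLE.
grid = np.linspace(0.05, 1.0, 40)
best_scale = grid[np.argmax([marginal_loglik(s, x, obs) for s in grid])]
```

The selected `best_scale` would then be plugged back into the prior to form the posterior, which is the plug-in scheme whose contraction rates the paper analyses (there, with random Hellinger-type metrics and one- or two-dimensional hyper-parameter sets).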

Authors (1)