Posterior Contraction Rates for Gaussian Cox Processes with Non-identically Distributed Data (1906.08799v2)

Published 20 Jun 2019 in math.ST and stat.TH

Abstract: This paper considers the posterior contraction of non-parametric Bayesian inference on non-homogeneous Poisson processes. We consider the quality of inference on a rate function $\lambda$, given non-identically distributed realisations, whose rates are transformations of $\lambda$. Such data arises frequently in practice due, for instance, to the challenges of making observations with limited resources or the effects of weather on detectability of events. We derive contraction rates for the posterior estimates arising from the Sigmoidal Gaussian Cox Process and Quadratic Gaussian Cox Process models. These are popular models where $\lambda$ is modelled as a logistic and quadratic transformation of a Gaussian Process respectively. Our work extends beyond existing analyses in several regards. Firstly, we consider non-identically distributed data, previously unstudied in the Poisson process setting. Secondly, we consider the Quadratic Gaussian Cox Process model, of which there was previously little theoretical understanding. Thirdly, we provide rates on the shrinkage of both the width of balls around the true $\lambda$ in which the posterior mass is concentrated and on the shrinkage of posterior mass outside these balls; usually only the former is explicitly given. Finally, our results hold for certain finite numbers of observations, rather than only asymptotically, and we relate particular choices of hyperparameter/prior to these results.
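The two link functions named in the abstract can be sketched as follows: a latent Gaussian Process sample path $g$ is passed through a logistic transform (Sigmoidal Gaussian Cox Process) or squared (Quadratic Gaussian Cox Process) to produce a valid rate function $\lambda$. The grid, kernel, length-scale, and upper bound $\lambda_{\max}$ below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Evaluation grid on [0, 1] (illustrative choice)
t = np.linspace(0.0, 1.0, 200)

# Squared-exponential covariance for the latent GP g; kernel and
# length-scale are assumptions for this sketch, not from the paper
length_scale = 0.2
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / length_scale**2)
K += 1e-8 * np.eye(len(t))  # jitter for a stable Cholesky factorisation

# Draw one sample path g ~ N(0, K)
g = np.linalg.cholesky(K) @ rng.standard_normal(len(t))

# Sigmoidal Gaussian Cox Process: lambda(t) = lambda_max * sigmoid(g(t)),
# which keeps the rate bounded in (0, lambda_max)
lambda_max = 10.0
lam_sigmoidal = lambda_max / (1.0 + np.exp(-g))

# Quadratic Gaussian Cox Process: lambda(t) = g(t)^2,
# non-negative by construction and unbounded above
lam_quadratic = g**2
```

Both transforms guarantee a non-negative rate, which is the structural requirement for a Poisson process intensity; the sigmoidal link additionally imposes an upper bound, while the quadratic link does not.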
