Going off grid: Computationally efficient inference for log-Gaussian Cox processes (1111.0641v3)

Published 1 Nov 2011 in stat.CO, math.ST, stat.AP, and stat.TH

Abstract: This paper introduces a new method for performing computational inference on log-Gaussian Cox processes. The likelihood is approximated directly by making novel use of a continuously specified Gaussian random field. We show that for sufficiently smooth Gaussian random field prior distributions, the approximation can converge with arbitrarily high order, whereas an approximation based on a counting process on a partition of the domain achieves only first-order convergence. These results improve on the general theory of convergence for the stochastic partial differential equation models introduced by Lindgren et al. (2011). The new method is demonstrated on a standard point pattern data set, and two interesting extensions to the classical log-Gaussian Cox process framework are discussed. The first extension considers variable sampling effort throughout the observation window and implements the method of Chakraborty et al. (2011). The second extension constructs a log-Gaussian Cox process on the world's oceans. The analysis is performed using integrated nested Laplace approximation for fast approximate inference.
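For reference, the object being approximated is the LGCP likelihood conditional on the latent Gaussian field. The following is a minimal sketch in standard notation; the specific symbols (quadrature weights and nodes) are illustrative and not taken verbatim from the paper. For a point pattern Y = {s_1, ..., s_n} observed on a window \Omega with intensity \Lambda(s) = \exp\{Z(s)\},

\log \pi(Y \mid Z) = -\int_\Omega \exp\{Z(s)\}\,\mathrm{d}s + \sum_{i=1}^{n} Z(s_i) + \text{const},

and the "off grid" strategy replaces the intractable integral with a deterministic quadrature rule

\int_\Omega \exp\{Z(s)\}\,\mathrm{d}s \;\approx\; \sum_{j=1}^{p} \tilde{w}_j \exp\{Z(\tilde{s}_j)\},

evaluated directly against the continuously specified Gaussian random field, rather than aggregating the observed points into counts on a partition of the domain.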
