An Extended Simplified Laplace strategy for Approximate Bayesian inference of Latent Gaussian Models using R-INLA (2203.14304v1)

Published 27 Mar 2022 in stat.ME

Abstract: Various computational challenges arise when applying Bayesian inference to complex hierarchical models. Sampling-based methods, such as Markov Chain Monte Carlo strategies, are renowned for providing accurate results but come with high computational costs and slow or questionable convergence. In contrast, approximate methods like the Integrated Nested Laplace Approximation (INLA) construct a deterministic approximation to the univariate posteriors through nested Laplace Approximations. This enables fast inference for Latent Gaussian Models, which encode a large class of hierarchical models. The R-INLA software provides three strategies for computing the required posterior approximations, depending on the accuracy requirements. The Simplified Laplace Approximation (SLA) is the most attractive because of its speed, since it is based on a Taylor expansion up to order three of a full Laplace Approximation. Here we enhance the methodology by simplifying the computations needed for the skewness and the modal configuration. We then propose an expansion up to order four and use the Extended Skew Normal distribution as a new parametric fit. The resulting approximations to the marginal posterior densities are more accurate than those obtained with the SLA, at essentially no additional cost.
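The core idea behind the SLA and the proposed extension can be sketched as follows (a schematic only; the coefficient symbols are illustrative labels and may differ from the paper's exact notation). Writing $x_i^{(s)}$ for the latent component standardized by the mean and standard deviation of its Gaussian approximation, the SLA expands the log of the Laplace approximation as

$$\log \tilde{\pi}_{\mathrm{LA}}\big(x_i^{(s)} \mid \theta, y\big) \;\approx\; \mathrm{const} \;-\; \tfrac{1}{2}\big(x_i^{(s)}\big)^2 \;+\; \gamma_i^{(1)}(\theta)\, x_i^{(s)} \;+\; \tfrac{1}{6}\,\gamma_i^{(3)}(\theta)\,\big(x_i^{(s)}\big)^3$$

and matches a Skew Normal density to the first- and third-order corrections. The extension described in the abstract keeps one more term, of the form $\tfrac{1}{24}\,\gamma_i^{(4)}(\theta)\,\big(x_i^{(s)}\big)^4$, and uses the extra parameter of the Extended Skew Normal family to absorb this fourth-order information.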

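For readers who want to see where these strategies sit in practice, the sketch below shows how the approximation strategy is selected through the R-INLA interface. It is a minimal illustration only: the toy data, formula, and object names are invented, the strategy values are those documented for released R-INLA ("gaussian", "simplified.laplace", "laplace"), and the order-four/Extended Skew Normal variant proposed in the paper is not assumed to be available as an option.

```r
# Minimal sketch; assumes the INLA package is installed from r-inla.org.
library(INLA)

# Toy Poisson regression with an iid random effect -- purely illustrative.
n <- 100
dat <- data.frame(y = rpois(n, lambda = 5), x = rnorm(n), idx = 1:n)
formula <- y ~ x + f(idx, model = "iid")

# The approximation strategy for the latent-field marginals is chosen via
# control.inla$strategy: "gaussian", "simplified.laplace" (the SLA the paper
# builds on), or "laplace" (the full Laplace approximation).
fit_sla <- inla(formula, family = "poisson", data = dat,
                control.inla = list(strategy = "simplified.laplace"))

fit_la <- inla(formula, family = "poisson", data = dat,
               control.inla = list(strategy = "laplace"))

# Compare the marginal posteriors of a fixed effect under the two strategies;
# the paper's extended SLA aims to close this accuracy gap at roughly SLA cost.
plot(fit_sla$marginals.fixed$x, type = "l")
lines(fit_la$marginals.fixed$x, lty = 2)
```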