Bayesian inference for latent factor GARCH models (1507.01179v1)

Published 5 Jul 2015 in stat.ME

Abstract: Latent factor GARCH models are difficult to estimate using Bayesian methods because standard Markov chain Monte Carlo samplers produce slowly mixing and inefficient draws from the posterior distributions of the model parameters. This paper describes how to apply the particle Gibbs algorithm to estimate factor GARCH models efficiently. The method has two advantages over previous approaches. First, it generalises in a straightforward way to models with multiple factors and to various members of the GARCH family. Second, it scales up well as the dimension of the observation vector increases.
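
The sketch below is illustrative only, not the paper's algorithm: it simulates a one-factor latent GARCH(1,1) model and runs a bootstrap particle filter over the factor path. A conditional version of such a filter (one that retains a fixed reference trajectory) is the building block of a particle Gibbs update. All parameter names and values (omega, alpha, beta, lam, sigma_e, N) are assumptions chosen for the example.

```python
# Minimal sketch (assumed model form, not taken from the paper):
# y_t = lam * f_t + e_t,  f_t ~ N(0, h_t),  h_t = omega + alpha*f_{t-1}^2 + beta*h_{t-1}
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters (assumptions for this sketch) ---
T, p, N = 300, 5, 500                  # time points, observation dim, particles
omega, alpha, beta = 0.1, 0.1, 0.85    # GARCH(1,1) coefficients of the factor
lam = rng.normal(1.0, 0.3, size=p)     # factor loadings
sigma_e = 0.5                          # idiosyncratic noise std

# --- Simulate data from the latent one-factor GARCH model ---
f = np.zeros(T)
h = np.zeros(T)
h[0] = omega / (1 - alpha - beta)      # unconditional variance as a start value
f[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, T):
    h[t] = omega + alpha * f[t - 1] ** 2 + beta * h[t - 1]
    f[t] = np.sqrt(h[t]) * rng.standard_normal()
y = f[:, None] * lam + sigma_e * rng.standard_normal((T, p))

# --- Bootstrap particle filter over the latent factor path ---
def log_weights(y_t, f_particles):
    """Gaussian log-density of y_t given each particle's factor value
    (additive constants dropped, which is fine for normalized weights)."""
    resid = y_t[None, :] - f_particles[:, None] * lam[None, :]
    return -0.5 * np.sum(resid ** 2, axis=1) / sigma_e ** 2

fp = np.sqrt(h[0]) * rng.standard_normal(N)   # initial factor particles
hp = np.full(N, h[0])                         # their conditional variances
loglik = 0.0
for t in range(T):
    if t > 0:
        hp = omega + alpha * fp ** 2 + beta * hp   # propagate GARCH variance
        fp = np.sqrt(hp) * rng.standard_normal(N)  # propagate factor
    logw = log_weights(y[t], fp)
    m = logw.max()
    w = np.exp(logw - m)
    loglik += m + np.log(w.mean())                 # incremental likelihood (up to a constant)
    idx = rng.choice(N, size=N, p=w / w.sum())     # multinomial resampling
    fp, hp = fp[idx], hp[idx]                      # resample (f, h) pairs jointly

print(f"particle-filter log-likelihood estimate (up to a constant): {loglik:.1f}")
```

In a particle Gibbs scheme one would alternate a conditional version of this filter (drawing a new factor path while keeping one reference trajectory fixed) with standard MCMC updates of the GARCH and loading parameters given the sampled factors.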
