
Bayesian inference on the order of stationary vector autoregressions (2307.05708v2)

Published 11 Jul 2023 in stat.ME

Abstract: Vector autoregressions (VARs) are a widely used tool for modelling multivariate time series. It is common to assume a VAR is stationary; this can be enforced by imposing the stationarity condition, which restricts the parameter space of the autoregressive coefficients to the stationary region. However, implementing this constraint is difficult due to the complex geometry of the stationary region. Fortunately, recent work has provided a solution for autoregressions of fixed order $p$ based on a reparameterization in terms of a set of interpretable and unconstrained transformed partial autocorrelation matrices. In this work, focus is placed on the difficult problem of allowing $p$ to be unknown, developing a prior and computational inference that takes full account of order uncertainty. Specifically, the multiplicative gamma process is used to build a prior which encourages increasing shrinkage of the partial autocorrelations with increasing lag. Identifying the lag beyond which the partial autocorrelations become equal to zero then determines $p$. Based on classic time-series theory, a principled choice of truncation criterion identifies whether a partial autocorrelation matrix is effectively zero. Posterior inference utilizes Hamiltonian Monte Carlo via Stan. The work is illustrated in a substantive application to neural activity data to investigate ultradian brain rhythms.
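The core shrinkage mechanism referenced in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it uses the standard multiplicative gamma process construction (cumulative products of gamma variables acting as lag-specific precisions), with hyperparameter names `a1` and `a2` chosen here for illustration. With `a2 > 1`, the precisions tend to grow with lag, so partial autocorrelations given mean-zero priors with these precisions are increasingly shrunk toward zero at higher lags.

```python
import numpy as np

def mgp_precisions(a1=2.0, a2=3.0, H=10, seed=None):
    """Draw lag-specific precisions from a multiplicative gamma process.

    tau_h = prod_{l<=h} delta_l, with delta_1 ~ Ga(a1, 1) and
    delta_l ~ Ga(a2, 1) for l >= 2. A partial autocorrelation given
    a N(0, 1/tau_h) prior is therefore shrunk more strongly as h grows.
    """
    rng = np.random.default_rng(seed)
    deltas = np.concatenate([rng.gamma(a1, 1.0, size=1),
                             rng.gamma(a2, 1.0, size=H - 1)])
    return np.cumprod(deltas)

# The prior expected precision grows geometrically with lag:
# E[tau_h] = a1 * a2**(h - 1), so the prior standard deviation
# of each partial autocorrelation decays with h.
expected_tau = 2.0 * 3.0 ** np.arange(10)
```

Identifying the lag beyond which the (shrunken) partial autocorrelations are effectively zero then yields the inferred order $p$, per the truncation criterion described in the abstract.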

Citations (2)
