
Regularized Estimation of High-Dimensional Matrix-Variate Autoregressive Models (2410.11320v1)

Published 15 Oct 2024 in stat.ME

Abstract: Matrix-variate time series data are increasingly popular in economics, statistics, and environmental studies, among other fields. This paper develops regularized estimation methods for analyzing high-dimensional matrix-variate time series using bilinear matrix-variate autoregressive models. The bilinear autoregressive structure is widely used for matrix-variate time series, as it reduces model complexity while capturing interactions between rows and columns. However, when dealing with large dimensions, the commonly used iterated least-squares method results in numerous estimated parameters, making interpretation difficult. To address this, we propose two regularized estimation methods to further reduce model dimensionality. The first assumes banded autoregressive coefficient matrices, where each data point interacts only with nearby points. A two-step estimation method is used: first, traditional iterated least-squares is applied for initial estimates, followed by a banded iterated least-squares approach. A Bayesian Information Criterion (BIC) is introduced to estimate the bandwidth of the coefficient matrices. The second method assumes sparse autoregressive matrices, applying the LASSO technique for regularization. We derive asymptotic properties for both methods as the dimensions diverge and the sample size $T\rightarrow\infty$. Simulations and real data examples demonstrate the effectiveness of our methods, comparing their forecasting performance against common autoregressive models in the literature.
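To make the baseline method concrete, below is a minimal sketch of the iterated (alternating) least-squares estimation that the abstract describes as the starting point, assuming the standard bilinear MAR(1) specification $X_t = A X_{t-1} B^\top + E_t$ with $X_t$ an $m \times n$ matrix. The function name, normalization choice, and simulated example are illustrative assumptions, not the paper's implementation; the paper's banded and LASSO-regularized estimators would build further structure on top of initial estimates like these.

```python
import numpy as np

def iterated_ls_mar1(X, n_iter=100, tol=1e-8):
    """Alternating least-squares sketch for a bilinear MAR(1) model
        X_t = A X_{t-1} B' + E_t,
    where X has shape (T, m, n).  Illustrative only: this is the generic
    unregularized estimator, not the paper's banded or LASSO variants.
    """
    T, m, n = X.shape
    A, B = np.eye(m), np.eye(n)
    prev_loss = np.inf
    for _ in range(n_iter):
        # Update A with B held fixed: regress X_t on Z_{t-1} = X_{t-1} B'
        Z = X[:-1] @ B.T
        num = np.einsum('tij,tkj->ik', X[1:], Z)   # sum_t X_t Z_{t-1}'
        den = np.einsum('tij,tkj->ik', Z, Z)       # sum_t Z_{t-1} Z_{t-1}'
        A = num @ np.linalg.pinv(den)
        # Update B with A held fixed: regress X_t' on W_{t-1} = (A X_{t-1})'
        W = np.transpose(A @ X[:-1], (0, 2, 1))
        Xt = np.transpose(X[1:], (0, 2, 1))
        num = np.einsum('tij,tkj->ik', Xt, W)      # sum_t X_t' W_{t-1}'
        den = np.einsum('tij,tkj->ik', W, W)       # sum_t W_{t-1} W_{t-1}'
        B = num @ np.linalg.pinv(den)
        # (A, B) are identified only up to scale, so normalize A
        scale = np.linalg.norm(A)
        A, B = A / scale, B * scale
        # Stop when the Frobenius-norm fitting error stabilizes
        resid = X[1:] - A @ X[:-1] @ B.T
        loss = np.sum(resid ** 2)
        if abs(prev_loss - loss) < tol * max(loss, 1.0):
            break
        prev_loss = loss
    return A, B

# Toy usage on simulated data with a banded true coefficient matrix
rng = np.random.default_rng(0)
m, n, T = 4, 3, 500
A0 = 0.5 * np.eye(m) + 0.1 * np.diag(np.ones(m - 1), 1)   # banded (hypothetical) truth
B0 = 0.6 * np.eye(n)
X = np.zeros((T, m, n))
for t in range(1, T):
    X[t] = A0 @ X[t - 1] @ B0.T + 0.1 * rng.standard_normal((m, n))
A_hat, B_hat = iterated_ls_mar1(X)
```

In the high-dimensional setting the paper targets, these unregularized estimates would serve only as the first step; the second step would then restrict the coefficient matrices to a band (with the bandwidth chosen by BIC) or shrink their entries toward zero with a LASSO penalty.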
