
A theorem of Kalman and minimal state-space realization of Vector Autoregressive Models (1910.02546v1)

Published 6 Oct 2019 in math.ST, cs.SY, eess.SY, math.AG, q-fin.ST, and stat.TH

Abstract: We introduce a concept of *autoregressive* (AR) state-space realization that can be applied to all transfer functions $\boldsymbol{T}(L)$ with $\boldsymbol{T}(0)$ invertible. We show that a theorem of Kalman implies each Vector Autoregressive model (with exogenous variables) has a minimal AR-state-space realization of the form $\boldsymbol{y}_t = \sum_{i=1}^{p}\boldsymbol{H}\boldsymbol{F}^{i-1}\boldsymbol{G}\boldsymbol{x}_{t-i}+\boldsymbol{\epsilon}_t$, where $\boldsymbol{F}$ is a nilpotent Jordan matrix and $\boldsymbol{H}, \boldsymbol{G}$ satisfy certain rank conditions. The case $VARX(1)$ corresponds to reduced-rank regression. As in that case, for a fixed Jordan form $\boldsymbol{F}$, $\boldsymbol{H}$ can be estimated by least squares as a function of $\boldsymbol{G}$. The likelihood function is a determinant ratio generalizing the Rayleigh quotient. It is unchanged if $\boldsymbol{G}$ is replaced by $\boldsymbol{S}\boldsymbol{G}$ for an invertible matrix $\boldsymbol{S}$ commuting with $\boldsymbol{F}$. Using this invariance, the search space for the maximum likelihood estimate can be constrained to equivalence classes of matrices satisfying a number of orthogonality relations, extending the results of reduced-rank analysis. Our results can be considered a multi-lag canonical correlation analysis. The method considered here provides a solution in the general case to the polynomial product regression model of Velu et al. We provide estimation examples. We also explore how the estimates vary with different Jordan matrix configurations and discuss methods to select a configuration. Our approach could provide an important dimension-reduction technique with potential applications in time series analysis and linear system identification. In the appendix, we link the reduced configuration space of $\boldsymbol{G}$ with a geometric object called a vector bundle.
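The structure of the realization above can be illustrated with a short NumPy sketch (not from the paper; the dimensions $k$, $m$, $p$ and the random $\boldsymbol{H}$, $\boldsymbol{G}$ are illustrative assumptions). It builds a nilpotent Jordan matrix $\boldsymbol{F}$, forms the lag coefficients $\boldsymbol{A}_i = \boldsymbol{H}\boldsymbol{F}^{i-1}\boldsymbol{G}$, and checks the rank constraint that drives the dimension reduction: the stacked coefficient matrix $[\boldsymbol{A}_1, \ldots, \boldsymbol{A}_p]$ has rank at most the state dimension $m$, rather than the full series dimension $k$.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 5  # dimension of the observed series (assumed for illustration)
m = 3  # state dimension, m < k, so the realization is rank-restricted
p = 3  # number of lags

# Nilpotent Jordan matrix F: a single Jordan block with eigenvalue 0,
# i.e. ones on the superdiagonal and zeros elsewhere, so F^m = 0.
F = np.diag(np.ones(m - 1), k=1)

H = rng.standard_normal((k, m))  # observation-side factor
G = rng.standard_normal((m, k))  # input-side factor

# Lag coefficient matrices A_i = H F^{i-1} G for i = 1, ..., p.
A = [H @ np.linalg.matrix_power(F, i - 1) @ G for i in range(1, p + 1)]

# The stacked coefficients [A_1, ..., A_p] factor through the m-dimensional
# state, so their joint rank is at most m -- the dimension reduction.
stacked = np.hstack(A)
print("rank of stacked coefficients:", np.linalg.matrix_rank(stacked))
```

An unconstrained $VAR(p)$ in $k$ variables would allow the stacked coefficient matrix to reach rank $k$; restricting it to rank $m < k$ via the $(\boldsymbol{H}, \boldsymbol{F}, \boldsymbol{G})$ factorization is what makes the realization minimal.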
