
Kullback-Leibler Divergence and Akaike Information Criterion in General Hidden Markov Models (2303.07673v1)

Published 14 Mar 2023 in math.ST and stat.TH

Abstract: To characterize the Kullback-Leibler divergence and Fisher information in general parametrized hidden Markov models, we first show that the log likelihood and its derivatives can be represented as additive functionals of a Markovian iterated function system, and then provide explicit characterizations of these two quantities through this representation. Moreover, we show that the Kullback-Leibler divergence can be locally approximated by a quadratic function determined by the Fisher information. Results relating to the Cramér-Rao lower bound and the Hájek-Le Cam local asymptotic minimax theorem are also given. As an application of our results, we provide a theoretical justification for using the Akaike information criterion (AIC) for model selection in general hidden Markov models. Lastly, we study three concrete models to illustrate our theory: a Gaussian vector autoregressive-moving average model of order $(p,q)$, recurrent neural networks, and the temporal restricted Boltzmann machine.
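For reference, the two classical identities the abstract invokes take the following standard textbook forms; the regularity assumptions and symbols here ($I(\theta_0)$, $\hat{\ell}_n$, $k$) are introduced for illustration and are not the paper's precise conditions for general hidden Markov models. For a parameter $\theta$ near the true value $\theta_0$,

$$D_{\mathrm{KL}}\bigl(P_{\theta_0}\,\|\,P_{\theta}\bigr) = \tfrac{1}{2}\,(\theta-\theta_0)^{\top} I(\theta_0)\,(\theta-\theta_0) + o\bigl(\lVert\theta-\theta_0\rVert^{2}\bigr),$$

where $I(\theta_0)$ is the Fisher information matrix, and the Akaike information criterion penalizes the maximized log likelihood $\hat{\ell}_n$ of a candidate model with $k$ free parameters as

$$\mathrm{AIC} = -2\,\hat{\ell}_n + 2k,$$

with lower AIC preferred. The paper's contribution is to show that such characterizations remain valid in the dependent-data, latent-state setting of general hidden Markov models.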
