
Rényi Divergence in General Hidden Markov Models (2106.01645v1)

Published 3 Jun 2021 in cs.IT and math.IT

Abstract: In this paper, we examine the existence of the Rényi divergence between two time-invariant general hidden Markov models with arbitrary positive initial distributions. By making use of a Markov chain representation of the probability distribution for the general hidden Markov model and the eigenvalue of the associated Markovian operator, we obtain, under some regularity conditions, convergence of the Rényi divergence. Using this device, we also characterize the Rényi divergence, and obtain the Kullback-Leibler divergence as the limit of the Rényi divergence as α → 1. Several examples, including classical finite-state hidden Markov models, Markov switching models, and recurrent neural networks, are given for illustration. Moreover, we develop a non-Monte Carlo method that computes the Rényi divergence of two-state Markov switching models via the underlying invariant probability measure, which is characterized by a Fredholm integral equation.
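As a concrete illustration of the α → 1 limit mentioned in the abstract, the sketch below computes the Rényi divergence D_α(P‖Q) = (α−1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α) for two discrete distributions and checks numerically that it approaches the Kullback-Leibler divergence as α → 1. The example distributions are arbitrary, chosen only for demonstration; the paper itself treats the far harder case where P and Q are laws of hidden Markov models.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1)
    between two discrete distributions p and q."""
    total = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(total) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of the Rényi divergence."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Arbitrary example distributions on a 3-point space.
p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

# As alpha approaches 1, the Rényi divergence approaches the KL divergence.
for alpha in (0.5, 0.9, 0.99, 0.999):
    print(alpha, renyi_divergence(p, q, alpha))
print("KL:", kl_divergence(p, q))
```

For hidden Markov models no such closed-form sum exists, which is why the paper resorts to the Markovian operator's eigenvalue to establish the divergence rate.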
