
Divergence Inequalities with Applications in Ergodic Theory (2411.17241v3)

Published 26 Nov 2024 in cs.IT, math.IT, and quant-ph

Abstract: The data processing inequality is central to information theory and motivates the study of monotonic divergences. However, it is not clear whether, operationally, we need to consider all such divergences. We establish a simple method for deriving Pinsker inequalities, as well as general bounds in terms of $\chi^{2}$-divergences, for twice-differentiable $f$-divergences. These tools imply new relations for input-dependent contraction coefficients. We use these relations to show that, for many $f$-divergences, the rate of contraction of a time-homogeneous Markov chain is characterized by the input-dependent contraction coefficient of the $\chi^{2}$-divergence. This coefficient is efficient to compute, and it is the fastest rate at which the chain could converge for this class of divergences. We show similar ideas hold for mixing times. Moreover, we extend these results to the Petz $f$-divergences in quantum information theory, albeit without any guarantee of efficient computation. These tools may have applications in other settings where iterative data processing is relevant.
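The abstract's claim that the input-dependent $\chi^{2}$ contraction coefficient is efficient to compute can be made concrete for finite chains: by a standard linear-algebra characterization, for a row-stochastic kernel $K$ and input distribution $\pi$, the coefficient equals the squared second-largest singular value of $\mathrm{diag}(\sqrt{\pi})\,K\,\mathrm{diag}(1/\sqrt{\pi K})$, so a single SVD suffices. Below is a minimal sketch in Python; the function name `chi2_contraction` and the example chain are illustrative, not taken from the paper.

```python
import numpy as np

def chi2_contraction(K, pi):
    """Input-dependent chi^2 contraction coefficient of a finite Markov
    kernel K (row-stochastic) at input distribution pi:

        eta(K, pi) = sup_{mu != pi} chi^2(mu K || pi K) / chi^2(mu || pi),

    computed as the squared second-largest singular value of
    diag(sqrt(pi)) @ K @ diag(1 / sqrt(pi @ K))."""
    pi_out = pi @ K                                   # output distribution pi K
    A = np.sqrt(pi)[:, None] * K / np.sqrt(pi_out)[None, :]
    s = np.linalg.svd(A, compute_uv=False)            # singular values, descending
    return s[1] ** 2                                  # s[0] == 1 (vector sqrt(pi))

# Illustrative two-state chain with stationary distribution pi = (2/3, 1/3).
K = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2 / 3, 1 / 3])
print(chi2_contraction(K, pi))   # 0.49 = 0.7**2, the squared second eigenvalue
```

For this reversible example the coefficient reduces to the square of the second-largest eigenvalue of the chain, matching the abstract's point that one cheap spectral quantity governs the contraction rate for a whole class of $f$-divergences.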
