Data processing theorems and the second law of thermodynamics (1007.2827v1)

Published 16 Jul 2010 in cs.IT, cond-mat.stat-mech, and math.IT

Abstract: We draw relationships between the generalized data processing theorems of Zakai and Ziv (1973 and 1975) and the dynamical version of the second law of thermodynamics, a.k.a. the Boltzmann H-Theorem, which asserts that the Shannon entropy, $H(X_t)$, pertaining to a finite-state Markov process $\{X_t\}$, is monotonically non-decreasing as a function of time $t$, provided that the steady-state distribution of this process is uniform across the state space (which is the case when the process designates an isolated system). It turns out that both the generalized data processing theorems and the Boltzmann H-Theorem can be viewed as special cases of a more general principle concerning the monotonicity (in time) of a certain generalized information measure applied to a Markov process. This gives rise to a new look at the generalized data processing theorem, which suggests exploiting certain degrees of freedom that may lead to better bounds for a given choice of the convex function that defines the generalized mutual information.
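
The two monotonicity statements in the abstract are easy to check numerically. Below is a minimal sketch (not from the paper; the transition matrix, the initial distribution, and the choice of convex function are illustrative assumptions) that iterates a finite-state Markov chain with a doubly stochastic transition matrix, so its stationary distribution is uniform, and prints the Shannon entropy $H(X_t)$ together with an f-divergence $D_f(p_t \| \pi)$ at each step. The former is non-decreasing and the latter non-increasing, in line with the H-Theorem and the generalized data processing view of monotone information measures.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def f_divergence(p, q, f):
    """D_f(p || q) = sum_i q_i * f(p_i / q_i) for a convex f with f(1) = 0."""
    return np.sum(q * f(p / q))

# Doubly stochastic transition matrix (rows and columns each sum to 1),
# so the uniform distribution is stationary. Chosen only for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])

pi = np.ones(3) / 3               # uniform stationary distribution
p = np.array([0.9, 0.05, 0.05])   # arbitrary initial distribution

# Convex f defining the chi-square divergence; any convex f with f(1) = 0 works.
f_chi2 = lambda u: (u - 1.0) ** 2

for t in range(10):
    H = shannon_entropy(p)        # should be non-decreasing in t
    D = f_divergence(p, pi, f_chi2)  # should be non-increasing in t
    print(f"t={t}: H(X_t)={H:.4f}  D_f(p_t || pi)={D:.4f}")
    p = p @ P                     # one step of the chain: p_{t+1} = p_t P
```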

Citations (23)
