Ternary Representation of Stochastic Change and the Origin of Entropy and Its Fluctuations (1902.09536v1)
Abstract: A change in a stochastic system has three representations: probabilistic, statistical, and informational. (i) The probabilistic representation is based on a change of random variable $u(\omega)\to\tilde{u}(\omega)$; this induces (ii) the statistical representation, a change in the probability distributions $F_u(x)\to F_{\tilde{u}}(x)$, $x\in\mathbb{R}^n$; and (iii) the informational representation, a change in the probability measure $\mathbb{P}\to\tilde{\mathbb{P}}$ under the same observable $u(\omega)$. In the informational representation the change is quantified by the logarithm of the Radon-Nikodym derivative, $\ln\left( \frac{ d \tilde{\mathbb{P}}}{ d\mathbb{P}}(\omega)\right)=-\ln\left(\frac{ d F_u}{ d F_{\tilde{u}}}(x)\right)$ evaluated at $x=u(\omega)$. Substituting a random variable into its own density function yields a fluctuating entropy whose expectation was given by Shannon. The informational representation of a deterministic transformation on $\mathbb{R}^n$ reveals entropic and energetic terms, together with the notions of configurational entropy of Boltzmann and Gibbs and the potential of mean force of Kirkwood. Mutual information arises for correlated $u(\omega)$ and $\tilde{u}(\omega)$, and a nonequilibrium thermodynamic entropy balance equation is identified.
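The fluctuating entropy described above can be illustrated numerically. The sketch below is a hypothetical example not taken from the paper: it assumes $u(\omega)\sim N(0,1)$ and a change of measure under which the same observable is distributed as $N(\mu,1)$, then evaluates the log Radon-Nikodym derivative $\ln\bigl(\tfrac{d\tilde{\mathbb{P}}}{d\mathbb{P}}(\omega)\bigr) = \ln f_{\tilde{u}}(u(\omega)) - \ln f_u(u(\omega))$ sample by sample. Its mean under $\mathbb{P}$ is minus the relative entropy $D(F_u\|F_{\tilde{u}}) = \mu^2/2$, while the sample-to-sample spread exhibits the entropy fluctuations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: u ~ N(0, 1) under P; the changed measure P~ makes
# the same observable distributed as N(mu, 1).
mu = 1.0
n = 1_000_000
u = rng.normal(0.0, 1.0, n)  # samples of u(omega) under P

# Log densities of the two distributions, evaluated at the samples
log_f_u = -0.5 * u**2 - 0.5 * np.log(2.0 * np.pi)
log_f_tilde = -0.5 * (u - mu) ** 2 - 0.5 * np.log(2.0 * np.pi)

# Fluctuating entropy: ln(dP~/dP)(omega) = -ln(dF_u/dF_u~)(x) at x = u(omega)
log_rn = log_f_tilde - log_f_u

# Expectation under P is -D(F_u || F_u~) = -mu^2/2; for mu = 1 this is -0.5
print(log_rn.mean())  # approx -0.5, up to Monte Carlo error
print(log_rn.std())   # the fluctuation scale of the entropy change
```

Note that `log_rn` is a random quantity: only its expectation recovers the (relative) Shannon entropy, which is the point of the "fluctuating entropy" terminology.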