Relative Entropy and Mutual Information in Gaussian Statistical Field Theory (2307.15548v2)
Abstract: Relative entropy is a powerful measure of the dissimilarity between two statistical field theories in the continuum. In this work, we study the relative entropy between Gaussian scalar field theories in a finite volume with different masses and boundary conditions. We show that the relative entropy depends crucially on $d$, the dimension of Euclidean space. Furthermore, we demonstrate that the mutual information between two disjoint regions in $\mathbb{R}^d$ is finite if the two regions are separated by a finite distance and satisfies an area law. We then construct an example of "touching" regions between which the mutual information is infinite. We argue that the properties of mutual information in scalar field theories can be explained by the Markov property of these theories.
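For orientation, the quantities named in the abstract follow the standard information-theoretic definitions (these are textbook expressions, not quoted from the paper): for two probability measures $p$ and $q$ on field configurations $\phi$,

$$ S(p \,\|\, q) \;=\; \int \mathcal{D}\phi \; p[\phi]\,\ln\frac{p[\phi]}{q[\phi]}, $$

and the mutual information between disjoint regions $A$ and $B$ is the relative entropy between the joint distribution restricted to $A \cup B$ and the product of its marginals,

$$ I(A:B) \;=\; S\big(p_{AB} \,\big\|\, p_A \otimes p_B\big). $$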