Taming correlations through entropy-efficient measure decompositions with applications to mean-field approximation
Abstract: The analysis of various models in statistical physics relies on the existence of decompositions of measures into mixtures of product-like components, where the goal is to attain a decomposition into measures whose entropy is close to that of the original measure, yet with small correlations between coordinates. We prove a related general result: for every isotropic measure $\mu$ on $\mathbb{R}^n$ and every $\epsilon > 0$, there exists a decomposition $\mu = \int \mu_\theta \, dm(\theta)$ such that $H(\mu) - \mathbb{E}_{\theta \sim m} H(\mu_\theta) \leq n \epsilon$ and $\mathbb{E}_{\theta \sim m} \mathrm{Cov}(\mu_\theta) \preceq \mathrm{Id}/\epsilon$. As an application, we prove a general bound for the mean-field approximation of Ising and Potts models, which is in a sense dimension-free, in both continuous and discrete settings. In particular, for an Ising model on $\{\pm 1\}^n$ or on $[-1,1]^n$, we show that the deficit between the mean-field approximation and the free energy is at most $C \frac{1+p}{p} \left( n \|J\|_{S_p} \right)^{\frac{p}{1+p}}$ for all $p > 0$, where $\|J\|_{S_p}$ denotes the Schatten-$p$ norm of the interaction matrix. For $p = 2$, this recovers the result of [Jain et al., 2018], but an optimal choice of $p$ often yields almost dimension-free bounds.
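As a rough numerical illustration, the quantity $C \frac{1+p}{p} \left( n \|J\|_{S_p} \right)^{\frac{p}{1+p}}$ can be evaluated for different values of $p$ to see how the choice of $p$ affects the bound. The sketch below is not from the paper: the interaction matrix `J` is a random symmetric placeholder and the constant `C` is set to 1 purely for illustration.

```python
import numpy as np

def schatten_norm(J, p):
    """Schatten-p norm of J: the l_p norm of its singular values."""
    s = np.linalg.svd(J, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

def deficit_bound(J, p, C=1.0):
    """Evaluate C * (1+p)/p * (n * ||J||_{S_p})^(p/(1+p)).

    C = 1 is a placeholder; the paper's bound holds up to an
    unspecified universal constant.
    """
    n = J.shape[0]
    return C * (1 + p) / p * (n * schatten_norm(J, p)) ** (p / (1 + p))

# Random symmetric interaction matrix, normalized by sqrt(n)
# (a common scaling for mean-field-type models; an assumption here).
rng = np.random.default_rng(0)
n = 100
A = rng.standard_normal((n, n)) / np.sqrt(n)
J = (A + A.T) / 2

# The bound holds for every p > 0, so one may minimize over p.
for p in (0.5, 1.0, 2.0, 4.0):
    print(f"p = {p}: bound = {deficit_bound(J, p):.2f}")
```

Since the bound is valid simultaneously for all $p > 0$, the tightest version is obtained by minimizing the printed expression over $p$, which is what allows nearly dimension-free rates when the spectrum of $J$ decays quickly.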