
Taming correlations through entropy-efficient measure decompositions with applications to mean-field approximation

Published 28 Nov 2018 in math.PR, math-ph, math.FA, and math.MP | (1811.11530v2)

Abstract: The analysis of various models in statistical physics relies on the existence of decompositions of measures into mixtures of product-like components, where the goal is to attain a decomposition into measures whose entropy is close to that of the original measure, yet with small correlations between coordinates. We prove a related general result: for every isotropic measure $\mu$ on $\mathbb{R}^n$ and every $\epsilon > 0$, there exists a decomposition $\mu = \int \mu_\theta \, dm(\theta)$ such that $H(\mu) - \mathbb{E}_{\theta \sim m} H(\mu_\theta) \leq n\epsilon$ and $\mathbb{E}_{\theta \sim m} \mathrm{Cov}(\mu_\theta) \preceq \mathrm{Id}/\epsilon$. As an application, we prove a general bound for the mean-field approximation of Ising and Potts models, which is in a sense dimension free, in both continuous and discrete settings. In particular, for an Ising model on $\{\pm 1\}^n$ or on $[-1,1]^n$, we show that the deficit between the mean-field approximation and the free energy is at most $C \frac{1+p}{p} \left( n \|J\|_{S_p} \right)^{\frac{p}{1+p}}$ for all $p > 0$, where $\|J\|_{S_p}$ denotes the Schatten-$p$ norm of the interaction matrix. For the case $p=2$ this recovers the result of [Jain et al., 2018], but an optimal choice of $p$ often yields almost dimension-free bounds.
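The deficit bound in the abstract depends on the interaction matrix only through its Schatten-$p$ norm (the $\ell^p$ norm of the singular values), so for a given $J$ one can evaluate the bound across $p$ and pick the best exponent. The sketch below does this numerically, taking the unspecified constant $C = 1$ purely for illustration; the matrix used is a Curie-Weiss-type example, not one from the paper.

```python
import numpy as np

def schatten_norm(J, p):
    """Schatten-p norm of J: the l^p norm of its singular values."""
    s = np.linalg.svd(J, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

def mean_field_deficit_bound(J, p, C=1.0):
    """Evaluate C * (1+p)/p * (n * ||J||_{S_p})^(p/(1+p)).

    C is the (unspecified) constant from the theorem; C = 1 here
    is an arbitrary choice for illustration only.
    """
    n = J.shape[0]
    return C * (1 + p) / p * (n * schatten_norm(J, p)) ** (p / (1 + p))

# Illustration: a rank-one Curie-Weiss-style interaction matrix,
# J = (beta/n) * ones, whose only nonzero singular value is beta,
# so ||J||_{S_p} = beta for every p.
n, beta = 100, 0.5
J = (beta / n) * np.ones((n, n))

bounds = {p: mean_field_deficit_bound(J, p) for p in (0.5, 1.0, 2.0, 4.0)}
best_p = min(bounds, key=bounds.get)
```

For this low-rank example the Schatten norm does not grow as $p$ shrinks, so the exponent $p/(1+p)$ dominates and small $p$ gives the tighter bound, consistent with the abstract's remark that optimizing over $p$ can beat the fixed $p = 2$ case.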

Authors (1)
Citations (31)
