Logarithmic Sobolev inequalities in discrete product spaces: a proof by a transportation cost distance (1507.02803v1)
Abstract: The aim of this paper is to prove an inequality between relative entropy and the sum of average conditional relative entropies of the following form: for a fixed probability measure $q^n$ on $\mathcal X^n$ ($\mathcal X$ a finite set) and any probability measure $p^n=\mathcal L(Y^n)$ on $\mathcal X^n$, we have
\begin{equation}\label{eq:main}
D(p^n\|q^n)\le \mathrm{Const.}\sum_{i=1}^n \mathbb E_{p^n}\, D\bigl(p_i(\cdot\,|\,Y_1,\dots,Y_{i-1},Y_{i+1},\dots,Y_n)\,\big\|\,q_i(\cdot\,|\,Y_1,\dots,Y_{i-1},Y_{i+1},\dots,Y_n)\bigr),
\end{equation}
where $p_i(\cdot\,|\,y_1,\dots,y_{i-1},y_{i+1},\dots,y_n)$ and $q_i(\cdot\,|\,x_1,\dots,x_{i-1},x_{i+1},\dots,x_n)$ denote the local specifications for $p^n$ and $q^n$, respectively. The constant depends on the properties of the local specifications of $q^n$. Inequality \eqref{eq:main} is meaningful in product spaces, in both the discrete and the continuous case, and can be used to prove a logarithmic Sobolev inequality for $q^n$, provided uniform logarithmic Sobolev inequalities are available for $q_i(\cdot\,|\,x_1,\dots,x_{i-1},x_{i+1},\dots,x_n)$, for all fixed $i$ and all fixed $(x_1,\dots,x_{i-1},x_{i+1},\dots,x_n)$. Inequality \eqref{eq:main} directly implies that the Gibbs sampler associated with $q^n$ is a contraction for relative entropy. We derive inequality \eqref{eq:main}, and thereby a logarithmic Sobolev inequality, in discrete product spaces by proving inequalities for an appropriate Wasserstein-like distance. A logarithmic Sobolev inequality is, roughly speaking, a contractivity property of relative entropy with respect to some Markov semigroup. It is much easier to prove contractivity for a distance between measures than for relative entropy: distances satisfy the triangle inequality, and well-known linear tools, such as estimates through matrix norms, can be applied to them.
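The abstract leaves the constant unspecified; in the special case where $q^n$ is itself a product measure, inequality \eqref{eq:main} is known to hold with constant $1$ (it is then the tensorization, or subadditivity, of relative entropy, and for uniform $q^n$ it reduces to Han's inequality). The following minimal numerical sketch, which is not part of the paper, checks this product case on a small binary cube; it uses the chain-rule identity $\mathbb E_{p^n} D\bigl(p_i(\cdot\,|\,Y_{-i})\,\|\,q_i\bigr) = D(p^n\|q^n)-D(p_{-i}\|q_{-i})$, where $p_{-i}$, $q_{-i}$ denote the marginals on the coordinates other than $i$. All variable names are illustrative.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n, k = 3, 2                       # three coordinates, binary alphabet
shape = (k,) * n

# Product reference measure q^n = q_1 x q_2 x q_3 (the case Const. = 1).
qs = [rng.dirichlet(np.ones(k)) for _ in range(n)]
q = np.ones(shape)
for i, qi in enumerate(qs):
    q = q * qi.reshape([k if j == i else 1 for j in range(n)])

# An arbitrary probability measure p^n on the same space.
p = rng.dirichlet(np.ones(k ** n)).reshape(shape)

def D(a, b):
    """Relative entropy D(a||b) in nats; assumes a << b."""
    mask = a > 0
    return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

lhs = D(p, q)

# Sum of average conditional relative entropies, computed from marginals
# via E_p D_i = D(p||q) - D(p_{-i}||q_{-i})  (chain rule, q a product).
rhs = sum(lhs - D(p.sum(axis=i), q.sum(axis=i)) for i in range(n))

print(f"D(p||q)       = {lhs:.6f}")
print(f"sum_i E_p D_i = {rhs:.6f}")
assert lhs <= rhs + 1e-12         # inequality (eq:main) with Const. = 1
\end{verbatim}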
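To make the Gibbs-sampler claim concrete, here is a sketch of the standard argument (the notation $Y_{-i}=(Y_1,\dots,Y_{i-1},Y_{i+1},\dots,Y_n)$ and the random-scan form of the sampler are assumptions of this sketch, not details taken from the abstract). Let $K_i$ be the kernel that resamples coordinate $i$ from $q_i(\cdot\,|\,Y_{-i})$, and let $K=\frac1n\sum_{i=1}^n K_i$ be the random-scan Gibbs sampler, so that $q^nK=q^n$. Since $p^nK_i$ has marginal $p_{-i}$ on the other coordinates and conditional law $q_i(\cdot\,|\,y_{-i})$ at coordinate $i$, the chain rule gives
\begin{equation*}
D(p^nK_i\|q^n)=D(p_{-i}\|q_{-i})=D(p^n\|q^n)-\mathbb E_{p^n} D\bigl(p_i(\cdot\,|\,Y_{-i})\,\big\|\,q_i(\cdot\,|\,Y_{-i})\bigr).
\end{equation*}
Averaging over $i$, using convexity of $D(\cdot\,\|\,q^n)$ and then \eqref{eq:main} with constant $C$ (writing $\mathbb E_{p^n}D_i$ for the conditional term above),
\begin{equation*}
D(p^nK\|q^n)\le \frac1n\sum_{i=1}^n D(p^nK_i\|q^n)
= D(p^n\|q^n)-\frac1n\sum_{i=1}^n \mathbb E_{p^n} D_i
\le \Bigl(1-\frac{1}{nC}\Bigr) D(p^n\|q^n),
\end{equation*}
which is the claimed contraction for relative entropy.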