Direct proof of the conditional variance inequality without entropy methods

Develop a direct proof, using classical functional inequalities, of the inequality ∑_{m=1}^M E[Var(f(X) | X_{-m})] ≥ (1/(2·κ*)) Var(f(X)) for X distributed according to a log-concave target π satisfying Assumption 2, without relying on relative-entropy-based arguments.

Background

The paper derives a lower bound on the spectral gap of the Gibbs sampler, which yields the conditional variance inequality ∑_{m=1}^M E[Var(f(X) | X_{-m})] ≥ (1/(2·κ*)) Var(f(X)). This inequality is framed as a variance analogue of the main entropy contraction result.

The authors state that they have not found a direct proof via standard tools (e.g., Poincaré, log-Sobolev, or related functional inequalities) and currently rely on entropy contraction. Establishing a purely functional-inequality-based proof would strengthen the theoretical foundations and potentially extend applicability.
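Although a direct functional-inequality proof is open, the inequality itself is easy to verify numerically in simple log-concave cases. The sketch below checks it for a bivariate Gaussian target and a linear test function, where conditional variances are available in closed form. Note the hedge: the paper's κ* is defined via its Assumption 2; here, purely for illustration, κ* is taken to be the condition number λ_max/λ_min of the covariance matrix.

```python
# Numerical sanity check of the conditional variance inequality
#   sum_m E[Var(f(X) | X_{-m})] >= Var(f(X)) / (2 * kappa*)
# for a 2D Gaussian target (a simple log-concave example).
# Assumption (not from the paper): kappa* is approximated by the
# condition number lambda_max / lambda_min of the covariance matrix.
import numpy as np

def check_inequality(rho, f_coeffs=np.array([1.0, 1.0])):
    # Covariance with unit variances and correlation rho.
    Sigma = np.array([[1.0, rho], [rho, 1.0]])
    eig = np.linalg.eigvalsh(Sigma)          # sorted ascending
    kappa = eig[-1] / eig[0]                 # illustrative stand-in for kappa*

    # For linear f(x) = a^T x: Var(f) = a^T Sigma a, and
    # Var(f(X) | X_{-m}) = a_m^2 * Var(X_m | X_{-m}), where the
    # Gaussian conditional variance is 1 / (Sigma^{-1})_{mm}
    # (a constant, so the outer expectation is trivial).
    Q = np.linalg.inv(Sigma)
    var_f = f_coeffs @ Sigma @ f_coeffs
    lhs = sum(f_coeffs[m] ** 2 / Q[m, m] for m in range(2))
    rhs = var_f / (2.0 * kappa)
    return lhs, rhs

for rho in [0.0, 0.3, 0.6, 0.9]:
    lhs, rhs = check_inequality(rho)
    assert lhs >= rhs, (rho, lhs, rhs)
```

For instance, at rho = 0.9 the left side is 2(1 − ρ²) = 0.38 while the right side is (2 + 2ρ)/(2κ) = 0.10, so the bound holds with room to spare; this is consistency evidence only, not a proof.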

References

To the best of our knowledge, eq:variance_inequality did not appear previously in the literature, and we have not been able to provide a direct proof of it using classical functional inequalities without passing through relative entropy.

eq:variance_inequality:

\sum_{m=1}^M E\big(\mathrm{Var}(f(X) \mid X_{-m})\big) \geq \frac{1}{2\kappa^*} \mathrm{Var}(f(X)).

Entropy contraction of the Gibbs sampler under log-concavity (2410.00858 - Ascolani et al., 1 Oct 2024) in Section 4 (Spectral gap and conditional variances)