
General validity of the ESS improvement bound across kernels and couplings

Prove that, for the weight-harmonization algorithm applied to any π-invariant Markov kernel and any valid coupling of that kernel, the one-step expected improvement in effective sample size admits the bound ess_{t+1} ≤ ess_t (1 − \bar{λ}/N)^{-N} and, as the number of chains N → ∞, satisfies ess_{t+1} ≲ ess_t exp(\bar{λ}), where \bar{λ} = (κ₀ − 1)²/4 and κ₀ is the ratio of the maximum to the minimum initial weight, thereby establishing the general asymptotic improvement structure beyond the perfect-sampling case.
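As an elementary consistency check (our illustration, not part of the problem statement), the exponential form arises as the monotone limit of the finite-N factor:

% For fixed \bar\lambda > 0 and N > \bar\lambda,
\left(1 - \frac{\bar\lambda}{N}\right)^{-N}
  = \exp\left( -N \log\left(1 - \frac{\bar\lambda}{N}\right) \right)
  = \exp\left( \bar\lambda + \frac{\bar\lambda^{2}}{2N} + O(N^{-2}) \right)
  \searrow \exp(\bar\lambda)
  \quad \text{as } N \to \infty.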


Background

In the perfect-sampling scenario K(x,·) = π(·), the authors derive that the achievable one-step improvement in effective sample size is bounded by ess_{t+1} ≤ ess_t (1 − \bar{λ}/N)^{-N}, which for large N yields ess_{t+1} ≲ ess_t exp(\bar{λ}). They conjecture that this improvement structure persists for general π-invariant kernels and couplings.
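A minimal numerical sketch of this large-N behaviour (illustrative only; kappa0 = 2 is a hypothetical value, not taken from the paper):

import math

# Hypothetical ratio of maximum to minimum initial weight.
kappa0 = 2.0
lam = (kappa0 - 1.0) ** 2 / 4.0  # \bar{lambda} = 0.25 here

# The finite-N improvement factor decreases towards exp(lam) from above.
for N in (10, 100, 1_000, 10_000):
    factor = (1.0 - lam / N) ** (-N)
    print(f"N = {N:>6}: (1 - lam/N)^(-N) = {factor:.6f}")

print(f"limit: exp(lam) = {math.exp(lam):.6f}")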

Proving this general bound would establish fundamental performance limits of weight harmonization across diverse MCMC settings, inform algorithmic design, and explain the conservative improvements observed in practice.

References

This gap, as $N \to \infty$, can be seen as an asymptotic regime of our method: because $(1 - x/n)^n$ is an increasing function of $n$ for positive $x$, the achievable improvement has to decrease as $N \to \infty$ until reaching $\exp\{\bar\lambda\}$. We conjecture that this structure holds in general for the expected improvement and general kernels and couplings.

A coupling-based approach to f-divergences diagnostics for Markov chain Monte Carlo (arXiv:2510.07559, Corenflos et al., 8 Oct 2025), Section 6.2 (Perfect sampling example and Rao–Blackwellization).