General validity of the ESS improvement bound across kernels and couplings
Prove that, for the weight-harmonization algorithm applied to any $\pi$-invariant Markov kernel and any valid coupling of that kernel, the one-step expected improvement in effective sample size admits the bound $\mathrm{ess}_{t+1} \le \mathrm{ess}_t \, (1 - \bar\lambda/N)^{-N}$ and, as the number of chains $N \to \infty$, satisfies $\mathrm{ess}_{t+1} \lesssim \mathrm{ess}_t \exp(\bar\lambda)$, where $\bar\lambda = (\kappa_0 - 1)^2/4$ and $\kappa_0$ is the ratio of the maximum to the minimum initial weight. This establishes the general asymptotic improvement structure beyond the perfect-sampling case.
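As a numerical illustration of the quantities in the statement (all weights below are synthetic, and the uniform distribution is an arbitrary assumption), the following sketch computes the effective sample size of a weight vector, the ratio $\kappa_0$, the rate $\bar\lambda = (\kappa_0 - 1)^2/4$, and the two bounds:

```python
import numpy as np

# Illustrative sketch with synthetic weights: compute the ESS of a weight
# vector, the max/min weight ratio kappa0, the rate lbar, and the finite-N
# and asymptotic bounds on ess_{t+1} from the statement above.
rng = np.random.default_rng(0)
N = 1000
w = rng.uniform(0.5, 1.5, size=N)       # hypothetical positive weights

ess = w.sum() ** 2 / (w ** 2).sum()     # standard ESS of a weight vector
kappa0 = w.max() / w.min()              # ratio of max to min initial weight
lbar = (kappa0 - 1.0) ** 2 / 4.0        # \bar{lambda} = (kappa0 - 1)^2 / 4

# Finite-N bound on ess_{t+1}, valid while lbar / N < 1, and its
# N -> infinity relaxation ess_t * exp(lbar).
bound = ess * (1.0 - lbar / N) ** (-N)
asym = ess * np.exp(lbar)

print(f"ess={ess:.1f}  kappa0={kappa0:.3f}  lbar={lbar:.3f}")
print(f"finite-N bound={bound:.1f}  asymptotic bound={asym:.1f}")
```

Since $(1 - \bar\lambda/N)^{-N} \ge \exp(\bar\lambda)$ for $0 < \bar\lambda < N$, the finite-$N$ bound is always the looser of the two.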
References
This gap, as $N \to \infty$, can be seen as an asymptotic regime of our method: because $(1 - x/n)^{n}$ is an increasing function of $n$ for positive $x < n$, the factor $(1 - \bar\lambda/N)^{-N}$ decreases as $N \to \infty$, so the achievable improvement shrinks until it reaches $\exp(\bar\lambda)$. We conjecture that this structure holds in general for the expected improvement under general kernels and couplings.
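The monotonicity argument above can be checked numerically; a minimal sketch, assuming a hypothetical value $\kappa_0 = 3$ (so $\bar\lambda = 1$), shows the bound factor $(1 - \bar\lambda/N)^{-N}$ decreasing in $N$ toward $\exp(\bar\lambda)$ from above:

```python
import math

# Numerical check (kappa0 is a hypothetical value): the bound factor
# (1 - lbar/N)^(-N) decreases monotonically in N and approaches exp(lbar)
# from above, matching the claimed asymptotic regime.
kappa0 = 3.0                       # assumed max/min initial-weight ratio
lbar = (kappa0 - 1.0) ** 2 / 4.0   # \bar{lambda} = (kappa0 - 1)^2 / 4

# The factor is only defined while lbar / N < 1.
Ns = (10, 100, 1000, 10000)
factors = [(1.0 - lbar / N) ** (-N) for N in Ns]
limit = math.exp(lbar)

for N, f in zip(Ns, factors):
    print(f"N={N:>6}: factor={f:.6f}  (limit exp(lbar)={limit:.6f})")
```

Each successive factor is smaller than the last but still exceeds $\exp(\bar\lambda)$, consistent with the achievable improvement decreasing toward its limit.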