Quantify the condition number χ of the regularized channel

Determine explicit quantitative bounds for the condition number χ of the regularized forward operator K, where χ is defined by χ := sup_{ρ ∈ S_λ} KL(π || ρ) / KL(Kπ || Kρ), in the setting where K is injective, the hypothesis class is parameterized continuously over a compact domain, and the true data distribution π is misspecified (not exactly representable by the model). Provide concrete estimates for χ, beyond mere finiteness, that can be used to sharpen the KL contraction rates established for the self-consistent stochastic interpolant (SCSI) scheme.
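
To make the target concrete, the following is a minimal numerical sketch, not the paper's construction: every ingredient (the linear map A, the noise level, the Gaussian mean-shift class standing in for S_λ, the helper functions kl_gauss and push) is invented here for illustration. It brute-forces the supremum defining χ for a hypothetical linear-Gaussian corruption channel, where each KL term has a closed form, and shows how an ill-conditioned channel inflates χ.

    import numpy as np

    def kl_gauss(m1, S1, m2, S2):
        # Closed-form KL( N(m1, S1) || N(m2, S2) ).
        d = len(m1)
        S2inv = np.linalg.inv(S2)
        dm = m2 - m1
        return 0.5 * (np.trace(S2inv @ S1) + dm @ S2inv @ dm - d
                      + np.log(np.linalg.det(S2) / np.linalg.det(S1)))

    # Hypothetical corruption channel K: x -> A x + Gaussian noise.
    # A is injective but strongly attenuates the second coordinate.
    A = np.diag([1.0, 0.1])
    noise_cov = 1.0 * np.eye(2)

    def push(mu, cov):
        # Pushforward of N(mu, cov) through the channel K.
        return A @ mu, A @ cov @ A.T + noise_cov

    # True data distribution; misspecified, since its covariance 0.5*I
    # is not attainable by any member of the hypothesis class below.
    mu_pi, cov_pi = np.zeros(2), 0.5 * np.eye(2)

    # Compact hypothesis class standing in for S_lambda:
    # rho_theta = N(theta, I), theta on a grid over [-2, 2]^2.
    chi_est = 0.0
    for t1 in np.linspace(-2.0, 2.0, 41):
        for t2 in np.linspace(-2.0, 2.0, 41):
            theta = np.array([t1, t2])
            num = kl_gauss(mu_pi, cov_pi, theta, np.eye(2))                # KL(pi || rho_theta)
            den = kl_gauss(*push(mu_pi, cov_pi), *push(theta, np.eye(2)))  # KL(K pi || K rho_theta)
            chi_est = max(chi_est, num / den)

    print(f"grid estimate of chi over this toy class: {chi_est:.2f}")

With these arbitrary numbers the grid maximum comes out around 57, and enlarging the compact class pushes it toward (0.1² + 1)/0.1² = 101 along the channel's most attenuated direction. An explicit expression of this flavor, derived under the paper's general injectivity, compactness, and continuity assumptions rather than for a toy Gaussian family, is what the question asks for.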

Background

In the KL contraction analysis, the authors introduce a condition number χ to relate errors measured in observation space back to data space. Specifically, χ compares KL(π || ρ) to KL(Kπ || Kρ) over a regularized class S_λ induced by constraints on the drift and score models, and it plays a central role in establishing linear convergence rates.
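
Unwinding the definition, χ is the smallest constant for which KL(π || ρ) ≤ χ · KL(Kπ || Kρ) holds for every ρ ∈ S_λ, so any contraction proved for the observation-space error KL(Kπ || Kρ) transfers to data space at the cost of the factor χ; an explicit value for χ would therefore turn the abstract rate into an explicit one. (This inequality is simply the supremum in the definition rewritten, not a restatement of the paper's precise bounds.)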

The paper proves that χ is finite when the forward operator K is injective, the parameter space is compact, and the drift/score parametrization is continuous—even in a misspecified setting where π is not exactly representable. However, the authors state they cannot quantify χ under these general assumptions, leaving the development of explicit bounds as an unresolved question that would strengthen the theoretical guarantees (e.g., sharper rates) of the proposed SCSI framework.
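
For orientation, one natural reading of that finiteness argument (a sketch under the stated assumptions, glossing over integrability details, and not a reproduction of the paper's proof) is as follows. Write the ratio as a function r(θ) := KL(π || ρ_θ) / KL(Kπ || Kρ_θ) over the compact parameter set. Misspecification gives ρ_θ ≠ π for every θ, and injectivity of K then forces Kρ_θ ≠ Kπ, so the denominator is strictly positive everywhere; continuity of the parametrization (and hence, under suitable regularity, of both KL terms) combined with compactness yields a strictly positive infimum for the denominator, a finite supremum for the numerator, and thus χ = sup_θ r(θ) < ∞. None of this produces a number: an explicit lower bound on inf_θ KL(Kπ || Kρ_θ) is precisely the missing quantitative ingredient.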

References

Unsurprisingly, under such general conditions, we are unable to quantify the condition number.

Generative Modeling from Black-box Corruptions via Self-Consistent Stochastic Interpolants (Modi et al., arXiv:2512.10857, 11 Dec 2025), Section 4.2 (Contraction in KL Divergence), paragraph following the proposition “Finite condition number for compact hypothesis class”.