Broader noise schedules and practical discretizations

Extend the geometry–distribution separation guarantees to broader noise schedules and to the discrete samplers used in practice. Concretely, this means establishing results that account for finite-step implementations, step-size selection, truncated time horizons, and non-uniform time weighting, while preserving the separation between geometry learning and distribution learning.
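
To make these discretization questions concrete, here is a minimal sketch, not the paper's construction, of a finite-step Euler discretization of the probability-flow ODE under a linear VP-type noise schedule. The schedule `beta`, the Gaussian score stand-in `gaussian_score`, and all parameter defaults are illustrative assumptions; the step count `n_steps`, the step-size rule (uniform here), and the truncated horizon `[t_min, T]` are exactly the knobs the extended theory would need to cover.

```python
import numpy as np

def beta(t, beta_min=0.1, beta_max=20.0):
    """Linear VP noise schedule beta(t) on t in [0, 1] (an assumed choice)."""
    return beta_min + t * (beta_max - beta_min)

def gaussian_score(x, t):
    """Exact score of N(0, I); a stand-in for a learned score model."""
    return -x

def pf_ode_sampler(score, dim, n_steps=100, T=1.0, t_min=1e-3, seed=0):
    """Euler steps on dx/dt = -0.5 * beta(t) * (x + score(x, t)),
    integrated backward from T down to the truncation point t_min."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)              # draw from the prior at t = T
    ts = np.linspace(T, t_min, n_steps + 1)   # uniform step-size rule
    for i in range(n_steps):
        t, dt = ts[i], ts[i + 1] - ts[i]      # dt < 0: backward in time
        x = x + (-0.5 * beta(t) * (x + score(x, t))) * dt
    return x

sample = pf_ode_sampler(gaussian_score, dim=2)
```

With the exact Gaussian score, the ODE drift vanishes and the sampler returns its prior draw, which makes the sketch easy to sanity-check before substituting a learned score model.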

Background

The theoretical guarantees are derived in a continuous-time framework with an idealized large-noise training condition and a hybrid SDE/ODE sampler. Practical deployments, by contrast, rely on discrete-time samplers and a variety of noise schedules.
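
For contrast with the idealized sampler, the following sketch shows one plausible discrete-time hybrid: Euler–Maruyama steps on the reverse SDE at high noise levels and deterministic probability-flow steps below a switch time. The switch time `t_switch` and the step rule are assumptions, not the paper's sampler; the sketch reuses `beta` and `gaussian_score` from the block above.

```python
def hybrid_sampler(score, dim, n_steps=100, T=1.0, t_min=1e-3,
                   t_switch=0.5, seed=0):
    """Stochastic reverse-SDE steps for t > t_switch, deterministic
    probability-flow ODE steps for t <= t_switch (an assumed split)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)
    ts = np.linspace(T, t_min, n_steps + 1)
    for i in range(n_steps):
        t, dt = ts[i], ts[i + 1] - ts[i]      # dt < 0: backward in time
        b = beta(t)
        if t > t_switch:
            # Euler-Maruyama on dx = (-0.5*b*x - b*score) dt + sqrt(b) dW
            drift = -0.5 * b * x - b * score(x, t)
            x = x + drift * dt + np.sqrt(b * abs(dt)) * rng.standard_normal(dim)
        else:
            # Euler on the probability-flow ODE, as in the previous sketch
            x = x + (-0.5 * b * (x + score(x, t))) * dt
    return x

sample = hybrid_sampler(gaussian_score, dim=2)
```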

The authors highlight the need to carry over the geometry–distribution separation to these realistic settings and training variations.
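
On the training side, one concrete reading of "truncated time horizons" and "non-uniform time weighting" is a denoising score-matching objective in which both appear as explicit parameters, as in the hedged sketch below. The VP marginal coefficients, the uniform default weighting, and the function names are illustrative assumptions, not the paper's objective.

```python
import numpy as np

def alpha_sigma(t, beta_min=0.1, beta_max=20.0):
    """VP marginal coefficients for x_t = alpha(t) * x_0 + sigma(t) * eps,
    with alpha(t) = exp(-0.5 * int_0^t beta(s) ds) for linear beta."""
    log_alpha = -0.5 * (beta_min * t + 0.5 * (beta_max - beta_min) * t ** 2)
    alpha = np.exp(log_alpha)
    return alpha, np.sqrt(1.0 - alpha ** 2)

def dsm_loss(score_model, x0, t_min=1e-3, T=1.0,
             weight=lambda t: 1.0, rng=None):
    """Monte Carlo estimate of E_t[ weight(t) * ||s(x_t, t) + eps/sigma(t)||^2 ]
    with t drawn uniformly on the truncated horizon [t_min, T]."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n, d = x0.shape
    t = rng.uniform(t_min, T, size=(n, 1))        # truncated time horizon
    alpha, sigma = alpha_sigma(t)
    eps = rng.standard_normal((n, d))
    xt = alpha * x0 + sigma * eps
    residual = score_model(xt, t) + eps / sigma   # target score is -eps/sigma
    return float(np.mean(weight(t) * np.sum(residual ** 2, axis=1, keepdims=True)))
```

For example, `dsm_loss(lambda x, t: -x, np.random.default_rng(1).standard_normal((64, 2)), weight=lambda t: t ** 2)` upweights late times; the open question is which such weightings and truncations preserve the geometry–distribution separation.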

References

Several directions remain open: noise schedules, discretizations, and training idealizations. It would be valuable to extend the theory to broader noise schedules and practically used discretizations, including the effects of finite-step samplers, step-size selection, and common training variations (e.g., truncated time horizons or non-uniform time weighting), while preserving a comparable separation between geometry learning and distribution learning.

Manifold Generalization Provably Precedes Memorization in Diffusion Models (2603.23792 - Shen et al., 24 Mar 2026) in Conclusion, Open directions (2)