Lower bounds for diffusion sample complexity in the tilted-sampling setting
Establish information-theoretic lower bounds on the sample complexity required by denoising diffusion probabilistic models, trained on reweighted empirical samples, to approximate an exponentially tilted target distribution μ_θ, with accuracy measured in expected Wasserstein distance. Specifically, determine how the minimal number of base samples N must scale with the tilt parameter θ, the data dimension d, and the other problem parameters in order to achieve a prescribed accuracy when the plug-in self-normalized importance sampler μ_{N,θ} is used as the input to the diffusion model.
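As a concrete illustration of the object μ_{N,θ} in the problem statement, the sketch below constructs the self-normalized importance sampler under the common assumption of a linear tilt, μ_θ(dx) ∝ e^{⟨θ,x⟩} μ(dx); the linear form of the tilt, and all variable names, are illustrative assumptions, not part of the original statement. Given base samples X_1, …, X_N ~ μ, the plug-in measure is μ_{N,θ} = Σ_i w_i δ_{X_i} with w_i = e^{⟨θ,X_i⟩} / Σ_j e^{⟨θ,X_j⟩}.

```python
import numpy as np

def tilted_weights(X, theta):
    """Self-normalized importance weights w_i proportional to exp(<theta, X_i>).

    X: array of shape (N, d) of base samples; theta: tilt vector in R^d.
    Uses a log-sum-exp shift so the weights are numerically stable for large tilts.
    """
    logits = X @ theta
    logits -= logits.max()   # shift before exponentiating (log-sum-exp trick)
    w = np.exp(logits)
    return w / w.sum()       # weights sum to 1 by construction

# Example: N = 1000 base samples in dimension d = 3 (hypothetical values).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 3))
theta = np.array([0.5, 0.0, -0.5])
w = tilted_weights(X, theta)
# mu_{N,theta} is then the discrete measure sum_i w[i] * delta_{X[i]},
# which would be fed to the diffusion model as its (reweighted) training data.
```

With θ = 0 the weights reduce to the uniform empirical measure, recovering the untilted plug-in estimator as a sanity check.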
While Theorem~\ref{thm:DiffWorks} gives an upper bound on the sample complexity of diffusion in expected Wasserstein distance, no lower bounds are known in this setting; the question of what lower bounds are achievable is completely open.