Leveraging Donsker–Varadhan for General Rate–Distortion Lower Bounds
Investigate and characterize to what extent the Donsker–Varadhan variational representation of the Kullback–Leibler divergence can be leveraged to derive general rate–distortion lower bounds within the information-theoretic learning framework presented in the paper. Develop conditions and analytic techniques that yield broadly applicable lower bounds beyond the linear regression case.
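As a concrete starting point, the Donsker–Varadhan representation states that for any test function f, KL(P‖Q) ≥ E_P[f] − log E_Q[e^f], with equality at f = log dP/dQ. The sketch below (an illustrative numerical check, not a construction from the paper) verifies this on two Gaussians, P = N(1, 1) and Q = N(0, 1), whose true divergence is 0.5; the test functions and sample sizes are chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# P = N(1, 1), Q = N(0, 1); the exact KL(P || Q) is 0.5.
n = 200_000
xp = rng.normal(1.0, 1.0, n)
xq = rng.normal(0.0, 1.0, n)

def dv_bound(f):
    """Donsker-Varadhan lower bound: E_P[f(X)] - log E_Q[exp f(X)]."""
    return f(xp).mean() - np.log(np.exp(f(xq)).mean())

# A suboptimal test function yields a strict lower bound (~0.375 here),
suboptimal = dv_bound(lambda x: 0.5 * x)
# while the optimal f(x) = log dP/dQ = x - 0.5 attains the true KL (~0.5).
optimal = dv_bound(lambda x: x - 0.5)
print(suboptimal, optimal)
```

In practice, deriving a lower bound this way reduces to exhibiting a good choice of test function f; the open question in the task above is whether such choices can be made systematically, beyond cases like linear regression where the log-likelihood ratio has a tractable form.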
References
To what extent tools such as the Donsker-Varadhan lower bound can be leveraged to derive general rate-distortion lower bounds is left as future work.
— Information-Theoretic Foundations for Machine Learning
(2407.12288 - Jeon et al., 17 Jul 2024) in Conclusion → Future Research