
Independence time in Rényi or chi-squared mutual information

Develop decay rates and corresponding independence-time bounds for Rényi mutual information and chi-squared mutual information along the Langevin diffusion and the Unadjusted Langevin Algorithm (ULA), leveraging the fact that mixing-time guarantees for these dynamics are already known in Rényi and chi-squared divergences.
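To make the setting concrete, here is a minimal sketch of the ULA iteration whose successive samples $X_0, X_k$ are the pair whose dependence would be measured. The function name `ula_chain` and the Gaussian example target are illustrative choices, not from the source.

```python
import numpy as np

def ula_chain(grad_potential, x0, step, n_steps, rng):
    """Unadjusted Langevin Algorithm (ULA):
        x_{k+1} = x_k - step * grad V(x_k) + sqrt(2 * step) * z_k,
    with z_k ~ N(0, I). Returns the whole trajectory so that the
    dependence between X_0 and X_k can be examined."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    traj = [x.copy()]
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_potential(x) + np.sqrt(2.0 * step) * noise
        traj.append(x.copy())
    return np.array(traj)

# Illustrative example: standard Gaussian target, V(x) = |x|^2 / 2, so grad V(x) = x.
rng = np.random.default_rng(0)
traj = ula_chain(lambda x: x, x0=5.0, step=0.1, n_steps=200, rng=rng)
```

The independence-time question asks how many steps $k$ are needed before a chosen mutual-information measure between `traj[0]` and `traj[k]` falls below a tolerance.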


Background

The paper’s results focus on Shannon mutual information, but mixing-time results for the Langevin diffusion and ULA are also established in other divergences, such as the Rényi and chi-squared divergences. A natural mutual-information analogue in these divergences is the expected Rényi (or chi-squared) divergence between the conditional and marginal distributions.
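One natural way to formalize these analogues (notation ours, stated as an assumption rather than the source's definition) is

```latex
I_{\chi^2}(X;Y) \;=\; \mathbb{E}_{y \sim P_Y}\!\left[\, \chi^2\!\left(P_{X \mid Y = y} \,\middle\|\, P_X\right) \right],
\qquad
I_q(X;Y) \;=\; \mathbb{E}_{y \sim P_Y}\!\left[\, \mathcal{R}_q\!\left(P_{X \mid Y = y} \,\middle\|\, P_X\right) \right],
```

where $\chi^2(P \,\|\, Q) = \int (\mathrm{d}P/\mathrm{d}Q)^2 \,\mathrm{d}Q - 1$ and $\mathcal{R}_q$ denotes the Rényi divergence of order $q$. Both quantities vanish exactly when $X$ and $Y$ are independent, matching the role Shannon mutual information plays in the paper.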

Quantifying decay in these alternative mutual information measures would provide a broader toolkit for independence diagnostics aligned with established mixing analyses.
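As an independence diagnostic in the simplest (finite, discrete) setting, the chi-squared mutual information reduces to a closed-form sum. This is a minimal sketch under that discrete assumption; the function name is hypothetical and the identity used is the standard one, not code from the source.

```python
import numpy as np

def chi2_mutual_information(p_xy):
    """Chi-squared mutual information of a discrete joint distribution p_xy:
        I_{chi^2}(X;Y) = E_{y ~ P_Y}[ chi^2(P_{X|Y=y} || P_X) ]
                       = sum_{x,y} (p(x,y) - p(x) p(y))^2 / (p(x) p(y)),
    which is zero iff X and Y are independent."""
    p_xy = np.asarray(p_xy, dtype=float)
    px = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    prod = px * py                         # product of marginals
    return float(((p_xy - prod) ** 2 / prod).sum())
```

For an independent joint distribution this returns 0; for the perfectly correlated joint `[[0.5, 0.0], [0.0, 0.5]]` it returns 1.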

References

While we study the convergence of mutual information along the Langevin diffusion and ULA, many other interesting questions remain open. Third, as the mixing time guarantees of the Langevin diffusion and the ULA are known in other divergences, such as in the Rényi or $\chi^2$-divergences, we can also study the independence time in Rényi or $\chi^2$-mutual information.

Characterizing Dependence of Samples along the Langevin Dynamics and Algorithms via Contraction of $\Phi$-Mutual Information (2402.17067 - Liang et al., 26 Feb 2024) in Discussion (Section: Discussion)