Extend mutual information convergence beyond strong log-concavity via isoperimetry
Determine whether the convergence rates for the mutual information between the initial and current states, established in this work for the Langevin diffusion and the Unadjusted Langevin Algorithm (ULA) under strong log-concavity, continue to hold under weaker assumptions, such as isoperimetric conditions (e.g., log-Sobolev inequalities), that are sufficient for mixing-time guarantees.
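As a hedged illustration of the strongly log-concave regime (not the paper's method), consider a 1-D standard Gaussian target, for which ULA with step size h reduces to the AR(1) recursion x_{k+1} = (1 - h) x_k + sqrt(2h) ξ_k. If X_0 is drawn from the ULA stationary law N(0, 1/(1 - h/2)), the pair (X_0, X_k) is jointly Gaussian with correlation ρ_k = (1 - h)^k, so I(X_0; X_k) = -½ log(1 - ρ_k²) decays geometrically. The function name and parameters below are illustrative choices, not from the paper.

```python
import numpy as np

def ula_mi_gaussian(h: float, steps: int, n_chains: int = 200_000,
                    seed: int = 0) -> tuple[np.ndarray, np.ndarray]:
    """Empirical vs. analytic I(X_0; X_k) along ULA for the N(0, 1) target.

    Illustrative sketch only: the exponential decay shown here relies on
    the Gaussian (hence strongly log-concave) structure of the target.
    """
    rng = np.random.default_rng(seed)
    var0 = 1.0 / (1.0 - h / 2.0)      # stationary variance of the ULA chain
    x0 = rng.normal(0.0, np.sqrt(var0), n_chains)
    x = x0.copy()
    mi_emp, mi_exact = [], []
    for k in range(1, steps + 1):
        # One ULA step for the potential V(x) = x^2 / 2.
        x = (1.0 - h) * x + np.sqrt(2.0 * h) * rng.normal(size=n_chains)
        rho_emp = np.corrcoef(x0, x)[0, 1]
        mi_emp.append(-0.5 * np.log(1.0 - rho_emp**2))
        rho = (1.0 - h) ** k          # exact correlation of (X_0, X_k)
        mi_exact.append(-0.5 * np.log(1.0 - rho**2))
    return np.array(mi_emp), np.array(mi_exact)

emp, exact = ula_mi_gaussian(h=0.1, steps=30)
# Mutual information contracts geometrically, at rate (1 - h)^2 per step.
```

The open problem asks whether a comparable contraction survives when strong log-concavity is replaced by an isoperimetric condition such as a log-Sobolev inequality, under which this explicit Gaussian calculation is no longer available.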
References
While we study the convergence of mutual information along the Langevin diffusion and ULA, many other interesting questions remain open. First, we can study whether our convergence results in mutual information also hold under weaker assumptions than strong log-concavity, such as isoperimetry, which are sufficient for mixing time guarantees.
                — Characterizing Dependence of Samples along the Langevin Dynamics and Algorithms via Contraction of $\Phi$-Mutual Information
                
                (arXiv:2402.17067, Liang et al., 26 Feb 2024), in the Discussion section