
Extend mutual information convergence beyond strong log-concavity via isoperimetry

Determine whether the convergence rates for the mutual information between the initial and current states, established in this work for the Langevin diffusion and the Unadjusted Langevin Algorithm (ULA) under strong log-concavity, continue to hold under weaker assumptions, such as isoperimetric inequalities (e.g., log-Sobolev-type conditions), that suffice for mixing time guarantees.


Background

The paper establishes exponential decay of mutual information along the Langevin diffusion for strongly log-concave targets, polynomial decay for merely log-concave targets, and exponential decay for ULA under strong log-concavity and smoothness. In mixing time analysis, by contrast, isoperimetric inequalities such as the log-Sobolev inequality often suffice for fast convergence, even without strong log-concavity.
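As a concrete illustration (not from the paper), the exponential decay of mutual information under strong log-concavity can be observed numerically for ULA on a 1-D standard Gaussian target, where log π(x) = -x²/2 is 1-strongly log-concave. Because the ULA update is linear in this case, (X₀, Xₖ) remains jointly Gaussian, so I(X₀; Xₖ) = -½ log(1 - ρₖ²) can be read off from the empirical correlation ρₖ:

```python
import numpy as np

# Sketch under assumed settings (step size h, horizon, chain count are
# illustrative choices, not values from the paper).
rng = np.random.default_rng(0)
n, h, steps = 200_000, 0.1, 30

x0 = rng.standard_normal(n)  # initialize n parallel chains at N(0, 1)
x = x0.copy()
mi = []
for k in range(1, steps + 1):
    # ULA step for pi = N(0, 1): grad log pi(x) = -x
    x = x - h * x + np.sqrt(2 * h) * rng.standard_normal(n)
    rho = np.corrcoef(x0, x)[0, 1]
    # Gaussian mutual information between X_0 and X_k
    mi.append(-0.5 * np.log1p(-rho**2))

# mi decays (approximately) exponentially in k, consistent with the
# exponential-rate regime under strong log-concavity.
print(mi[0], mi[9], mi[29])
```

The open question is whether this kind of decay persists when strong log-concavity is replaced by an isoperimetric condition; for such targets the chain is no longer linear and the Gaussian closed form above no longer applies.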

The authors highlight the gap between the assumptions sufficient for mixing time guarantees and those currently required for their mutual information results, asking whether analogous mutual information convergence can be proved under isoperimetric conditions.

References

While we study the convergence of mutual information along the Langevin diffusion and ULA, many other interesting questions remain open. First, we can study whether our convergence results in mutual information also hold under weaker assumptions than strong log-concavity, such as isoperimetry, which are sufficient for mixing time guarantees.

Characterizing Dependence of Samples along the Langevin Dynamics and Algorithms via Contraction of $\Phi$-Mutual Information (arXiv:2402.17067, Liang et al., 26 Feb 2024), Discussion section.