
Improved Langevin Monte Carlo for stochastic optimization via landscape modification (2302.03973v1)

Published 8 Feb 2023 in math.PR, cs.LG, math.OC, stat.CO, and stat.ML

Abstract: Given a target function $H$ to minimize or a target Gibbs distribution $\pi^0_{\beta} \propto e^{-\beta H}$ to sample from in the low temperature regime, in this paper we propose and analyze Langevin Monte Carlo (LMC) algorithms that run on an alternative landscape as specified by $H^f_{\beta,c,1}$ and target a modified Gibbs distribution $\pi^f_{\beta,c,1} \propto e^{-\beta H^f_{\beta,c,1}}$, where the landscape of $H^f_{\beta,c,1}$ is a transformed version of that of $H$ which depends on the parameters $f$, $\beta$ and $c$. While the original Log-Sobolev constant associated with $\pi^0_{\beta}$ exhibits exponential dependence on both $\beta$ and the energy barrier $M$ in the low temperature regime, with appropriate tuning of these parameters and subject to assumptions on $H$, we prove that the energy barrier of the transformed landscape is reduced, which consequently leads to polynomial dependence on both $\beta$ and $M$ in the modified Log-Sobolev constant associated with $\pi^f_{\beta,c,1}$. This yields improved total variation mixing time bounds and improved convergence toward a global minimum of $H$. We stress that the technique developed in this paper is not limited to LMC and is broadly applicable to other gradient-based optimization or sampling algorithms.
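For intuition, the modification acts through the drift of the LMC update: in related landscape-modification work, the transformed potential has gradient of the form $\nabla H^f_{\beta,c,1}(x) = \nabla H(x) / \bigl(1 + \beta f((H(x)-c)_+)\bigr)$, so the drift is damped wherever the energy exceeds a threshold $c$, flattening barriers above that level. The sketch below is a minimal illustration under that assumed form; `H`, `grad_H`, `f`, and all parameter choices are hypothetical placeholders, not the paper's exact construction or experimental setup.

```python
import numpy as np

def modified_lmc(H, grad_H, f, x0, beta, c, step, n_steps, rng=None):
    """Illustrative Langevin Monte Carlo on a modified landscape.

    Assumes (a sketch, not the paper's definition) that the transformed
    potential H^f_{beta,c,1} has gradient
        grad_H(x) / (1 + beta * f(max(H(x) - c, 0))),
    so the drift is damped above the energy threshold c while the
    Gaussian noise of plain LMC is kept unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    for _ in range(n_steps):
        # State-dependent damping of the drift above energy level c
        damping = 1.0 + beta * f(max(H(x) - c, 0.0))
        drift = grad_H(x) / damping
        # Euler-Maruyama step for dX = -grad H^f dt + sqrt(2/beta) dW
        x = x - step * drift + np.sqrt(2.0 * step / beta) * rng.standard_normal(x.shape)
    return x

# Toy usage: double-well H(x) = (x^2 - 1)^2 with f(u) = u and c below
# the barrier height, so crossings of the barrier at x = 0 are easier.
H = lambda x: float((x[0] ** 2 - 1.0) ** 2)
grad_H = lambda x: 4.0 * x * (x ** 2 - 1.0)
sample = modified_lmc(H, grad_H, f=lambda u: u, x0=[-1.0],
                      beta=10.0, c=0.1, step=1e-3, n_steps=50_000)
```

Because only the drift is rescaled, the same damping factor can in principle be dropped into other gradient-based samplers or optimizers, which is the broader applicability the abstract alludes to.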

Citations (1)
