
Non-convex sampling for a mixture of locally smooth potentials (2301.13706v1)

Published 31 Jan 2023 in stat.CO

Abstract: The purpose of this paper is to examine the sampling problem through Euler discretization, where the potential function is assumed to be a mixture of locally smooth potentials and weakly dissipative. We introduce $\alpha_{G}$-mixture local smoothness and $\alpha_{H}$-mixture local Hessian smoothness, novel conditions that are typically satisfied by mixtures of distributions. Under these conditions, we prove convergence in Kullback-Leibler (KL) divergence, with the number of iterations needed to reach an $\epsilon$-neighborhood of the target distribution depending only polynomially on the dimension. The convergence rate improves when the potential is $1$-smooth and $\alpha_{H}$-mixture locally Hessian smooth. Our result for potentials that are not strongly convex outside a ball of radius $R$ is obtained by convexifying the non-convex domains. In addition, we establish useful theoretical properties of $p$-generalized Gaussian smoothing and prove convergence in the $L_{\beta}$-Wasserstein distance for stochastic gradients in a general setting.
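The Euler discretization referenced in the abstract is the standard unadjusted Langevin algorithm (ULA): iterate $x_{k+1} = x_k - \eta \nabla U(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I)$, where $U$ is the potential. A minimal sketch, using a hypothetical equal-weight two-component 1D Gaussian mixture as the target (an illustrative non-convex potential, not an example from the paper):

```python
import numpy as np

def grad_U(x):
    # Gradient of the potential U(x) = -log p(x) for an equal-weight
    # mixture of N(-2, 1) and N(+2, 1); U is non-convex between the modes.
    a = np.exp(-0.5 * (x + 2.0) ** 2)
    b = np.exp(-0.5 * (x - 2.0) ** 2)
    # d/dx [-log(a + b)] = (a*(x+2) + b*(x-2)) / (a + b)
    return (a * (x + 2.0) + b * (x - 2.0)) / (a + b)

def ula(grad_U, x0, step=0.05, n_iters=20000, rng=None):
    # Unadjusted Langevin algorithm: Euler discretization of the
    # Langevin diffusion dX_t = -grad U(X_t) dt + sqrt(2) dW_t.
    rng = np.random.default_rng(0) if rng is None else rng
    x = float(x0)
    samples = np.empty(n_iters)
    for k in range(n_iters):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.normal()
        samples[k] = x
    return samples

samples = ula(grad_U, x0=0.0)
```

The paper's analysis concerns how many such iterations are needed (polynomial in dimension, under its mixture-smoothness conditions) before the law of $x_k$ is $\epsilon$-close to the target in KL divergence.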


Authors (1)
