
On stochastic gradient Langevin dynamics with dependent data streams in the logconcave case

Published 6 Dec 2018 in math.ST, math.PR, stat.ML, and stat.TH | arXiv:1812.02709v3

Abstract: We study the problem of sampling from a probability distribution $\pi$ on $\mathbb{R}^d$ which has a density with respect to the Lebesgue measure known up to a normalization factor, $x \mapsto \mathrm{e}^{-U(x)} / \int_{\mathbb{R}^d} \mathrm{e}^{-U(y)}\,\mathrm{d}y$. We analyze a sampling method based on the Euler discretization of the Langevin stochastic differential equation under the assumptions that the potential $U$ is continuously differentiable, $\nabla U$ is Lipschitz, and $U$ is strongly convex. We focus on the case where the gradient of the log-density cannot be computed directly, but unbiased estimates of the gradient from possibly dependent observations are available. This setting can be seen as a combination of a stochastic approximation (here, stochastic gradient) algorithm with discretized Langevin dynamics. We obtain an upper bound on the Wasserstein-2 distance between the law of the iterates of this algorithm and the target distribution $\pi$, with constants depending explicitly on the Lipschitz and strong convexity constants of the potential and on the dimension of the space. Finally, under weaker assumptions on $U$ and its gradient, but in the presence of independent observations, we obtain analogous results in Wasserstein-2 distance.
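The abstract describes stochastic gradient Langevin dynamics: an Euler discretization of the Langevin SDE in which $\nabla U$ is replaced by an unbiased estimate built from a data stream. Below is a minimal, illustrative sketch of that iteration on a toy strongly convex potential (a Gaussian target, so that $\nabla U(x) = x - \mu$ and observations $y \sim N(\mu, I)$ give the unbiased estimate $x - y$). The potential, step size, and iteration count are assumptions chosen for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: pi ∝ exp(-U(x)) with strongly convex potential
# U(x) = ||x - mu||^2 / 2, so grad U(x) = x - mu and pi = N(mu, I).
d = 2
mu = np.array([1.0, -2.0])

def grad_estimate(x, y):
    # Unbiased gradient estimate from an observation y ~ N(mu, I):
    # E[x - y] = x - mu = grad U(x).
    return x - y

lam = 0.01        # step size of the Euler discretization (illustrative)
n_iters = 20_000
x = np.zeros(d)
samples = []
for _ in range(n_iters):
    y = mu + rng.standard_normal(d)  # one observation from the data stream
    # SGLD update: gradient step with the noisy estimate plus injected
    # Gaussian noise scaled by sqrt(2 * step size).
    x = x - lam * grad_estimate(x, y) + np.sqrt(2 * lam) * rng.standard_normal(d)
    samples.append(x.copy())

# Discard the first half as burn-in; the empirical mean of the remaining
# iterates should be close to the target mean mu.
est_mean = np.mean(samples[n_iters // 2:], axis=0)
print(est_mean)
```

Here the observations are independent; the paper's main results additionally cover possibly dependent data streams, where only the unbiasedness of the gradient estimates (plus mixing-type conditions) is available.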

Citations (35)
