Gradient Estimation via Differentiable Metropolis-Hastings

Published 20 Jun 2024 in math.ST, math.PR, stat.CO, and stat.TH (arXiv:2406.14451v1)

Abstract: Metropolis-Hastings estimates intractable expectations - can differentiating the algorithm estimate their gradients? The challenge is that Metropolis-Hastings trajectories are not conventionally differentiable due to the discrete accept/reject steps. Using a technique based on recoupling chains, our method differentiates through the Metropolis-Hastings sampler itself, allowing us to estimate gradients with respect to a parameter of otherwise intractable expectations. Our main contribution is a proof of strong consistency and a central limit theorem for our estimator under assumptions that hold in common Bayesian inference problems. The proofs augment the sampler chain with latent information, and formulate the estimator as a stopping tail functional of this augmented chain. We demonstrate our method on examples of Bayesian sensitivity analysis and optimizing a random walk Metropolis proposal.

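To make the setting concrete, the sketch below shows plain random-walk Metropolis estimating an expectation E_theta[f(X)] under a parameterized target. It is not the paper's recoupling-based gradient estimator; the toy Gaussian target, the names `rw_metropolis` and `log_target`, and all tuning constants are illustrative assumptions. The discrete accept/reject step inside the loop is exactly what makes the sampler's trajectory non-differentiable in theta, which is the obstacle the paper addresses.

```python
import numpy as np

def rw_metropolis(log_target, theta, x0, n_steps, step_size=0.5, rng=None):
    """Random-walk Metropolis: approximate samples from the (possibly
    unnormalized) target p_theta, given only log_target(x, theta)."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        proposal = x + step_size * rng.standard_normal()
        # Discrete accept/reject step: this is what makes the trajectory
        # non-differentiable with respect to theta.
        log_alpha = log_target(proposal, theta) - log_target(x, theta)
        if np.log(rng.uniform()) < log_alpha:
            x = proposal
        samples[t] = x
    return samples

# Toy unnormalized target: Gaussian with mean theta (illustrative assumption).
def log_target(x, theta):
    return -0.5 * (x - theta) ** 2

theta = 1.0
samples = rw_metropolis(log_target, theta, x0=0.0, n_steps=20_000)

# Monte Carlo estimate of E_theta[f(X)] for f(x) = x**2 after burn-in;
# the paper's question is how to estimate d/dtheta of such expectations
# by differentiating through the sampler itself.
estimate = np.mean(samples[5_000:] ** 2)
print(estimate)
```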