
MCMC using $\textit{bouncy}$ Hamiltonian dynamics: A unifying framework for Hamiltonian Monte Carlo and piecewise deterministic Markov process samplers (2405.08290v2)

Published 14 May 2024 in stat.CO and stat.ME

Abstract: Piecewise-deterministic Markov process (PDMP) samplers constitute a state-of-the-art Markov chain Monte Carlo (MCMC) paradigm in Bayesian computation, with examples including the zig-zag and the bouncy particle sampler (BPS). Recent work on the zig-zag has indicated its connection to Hamiltonian Monte Carlo, a version of the Metropolis algorithm that exploits Hamiltonian dynamics. Here we establish that the connection between the two paradigms extends far beyond this specific instance. The key lies in (1) the fact that any time-reversible deterministic dynamics provides a valid Metropolis proposal and (2) the way PDMPs' characteristic velocity changes constitute an alternative to the usual acceptance-rejection step. We turn this observation into a rigorous framework for constructing rejection-free Metropolis proposals based on bouncy Hamiltonian dynamics, which simultaneously possess Hamiltonian-like properties and generate discontinuous trajectories similar in appearance to PDMPs. When combined with periodic refreshment of the inertia, the dynamics converge strongly to their PDMP equivalents in the limit of increasingly frequent refreshment. We demonstrate the practical implications of this new paradigm with a sampler based on bouncy Hamiltonian dynamics closely related to the BPS. The resulting sampler exhibits competitive performance on challenging real-data posteriors involving tens of thousands of parameters.
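Point (1) of the abstract — that any time-reversible, volume-preserving deterministic dynamics yields a valid Metropolis proposal — is most familiar from standard Hamiltonian Monte Carlo, where the leapfrog integrator plays that role. The following is a minimal illustrative sketch of that familiar instance on a standard-normal target, not the paper's bouncy dynamics; all function names and step-size choices here are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard-normal target: U(x) = x^2 / 2, so grad U(x) = x.
def grad_U(x):
    return x

def leapfrog(x, v, eps, n_steps):
    """Time-reversible, volume-preserving dynamics: a valid Metropolis proposal."""
    v = v - 0.5 * eps * grad_U(x)          # initial half step for velocity
    for _ in range(n_steps - 1):
        x = x + eps * v                    # full position step
        v = v - eps * grad_U(x)            # full velocity step
    x = x + eps * v
    v = v - 0.5 * eps * grad_U(x)          # final half step for velocity
    return x, v

def hmc_step(x, eps=0.2, n_steps=10):
    v = rng.standard_normal()              # refresh the inertia
    x_new, v_new = leapfrog(x, v, eps, n_steps)
    # Hamiltonian H = U(x) + v^2 / 2; accept with probability min(1, exp(-dH)).
    dH = (x_new**2 + v_new**2 - x**2 - v**2) / 2
    return x_new if rng.random() < np.exp(-dH) else x

samples = np.empty(5000)
x = 0.0
for i in range(5000):
    x = hmc_step(x)
    samples[i] = x
```

The framework of the paper replaces the accept-reject step above with PDMP-style velocity changes ("bounces"), making the proposal rejection-free while preserving the Hamiltonian-like structure.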

