
Debiasing Piecewise Deterministic Markov Process samplers using couplings (2306.15422v2)

Published 27 Jun 2023 in stat.CO, cs.DC, and stat.ME

Abstract: Monte Carlo methods -- such as Markov chain Monte Carlo (MCMC) and piecewise deterministic Markov process (PDMP) samplers -- provide asymptotically exact estimators of expectations under a target distribution. There is growing interest in alternatives to this asymptotic regime, in particular in constructing estimators that are exact in the limit of an infinite number of computing processors rather than in the limit of an infinite number of Markov iterations. Notably, Jacob et al. (2020) introduced coupled MCMC estimators that remove the non-asymptotic bias, yielding MCMC estimators that can be embarrassingly parallelised. In this work, we extend the estimators of Jacob et al. (2020) to the continuous-time context and derive couplings for the bouncy, the boomerang and the coordinate samplers. Preliminary empirical results demonstrate that our method scales reasonably with the dimension of the target.
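
To make the debiasing construction concrete, here is a minimal sketch of the discrete-time estimator of Jacob et al. (2020) (reference 36 below) that the paper extends to PDMP samplers. Two chains targeting the same distribution, one lagging the other by a single step, are advanced with a coupled kernel until they meet at a random time tau; the estimator H_k = h(X_k) + sum_{t=k+1}^{tau-1} [h(X_t) - h(Y_{t-1})] is then unbiased for the target expectation of h. The standard-normal target, the random-walk Metropolis kernel with a maximal coupling of its proposals, and all function names and tuning constants below are illustrative assumptions for this sketch, not the bouncy, boomerang, or coordinate-sampler couplings derived in the paper.

```python
# Minimal sketch of the Jacob et al. (2020) unbiased estimator (discrete time).
# All modelling choices here (target, kernel, constants) are illustrative.
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 0.8  # random-walk proposal standard deviation (arbitrary choice)


def log_target(x):
    """Standard normal target, up to an additive constant."""
    return -0.5 * x ** 2


def maximal_coupling_normals(mu_x, mu_y, sigma):
    """Draw (X, Y) from a maximal coupling of N(mu_x, sigma^2) and N(mu_y, sigma^2)."""
    logp = lambda z, m: -0.5 * ((z - m) / sigma) ** 2
    x = rng.normal(mu_x, sigma)
    if np.log(rng.uniform()) + logp(x, mu_x) <= logp(x, mu_y):
        return x, x  # the two proposals coincide
    while True:
        y = rng.normal(mu_y, sigma)
        if np.log(rng.uniform()) + logp(y, mu_y) > logp(y, mu_x):
            return x, y


def coupled_rwm_step(x, y):
    """One step of a coupled random-walk Metropolis kernel (shared acceptance uniform)."""
    xp, yp = maximal_coupling_normals(x, y, SIGMA)
    log_u = np.log(rng.uniform())
    x_new = xp if log_u <= log_target(xp) - log_target(x) else x
    y_new = yp if log_u <= log_target(yp) - log_target(y) else y
    return x_new, y_new


def unbiased_estimate(h, k=10, max_iter=100_000):
    """H_k = h(X_k) + sum_{t=k+1}^{tau-1} [h(X_t) - h(Y_{t-1})], tau = meeting time."""
    xs = [rng.normal()]  # X_0
    ys = [rng.normal()]  # Y_0, drawn independently
    xs.append(coupled_rwm_step(xs[0], xs[0])[0])  # X_1 ~ P(X_0, .), so X leads Y by one step
    t, tau = 1, None
    while True:
        if tau is None and xs[t] == ys[t - 1]:
            tau = t  # chains have met and remain together afterwards
        if tau is not None and len(xs) > k:
            break
        x_new, y_new = coupled_rwm_step(xs[t], ys[t - 1])
        xs.append(x_new)
        ys.append(y_new)
        t += 1
        if t > max_iter:
            raise RuntimeError("chains failed to meet within max_iter steps")
    return h(xs[k]) + sum(h(xs[s]) - h(ys[s - 1]) for s in range(k + 1, tau))


# Each call returns an unbiased estimate of E_pi[h(X)]; averaging independent
# replicates (e.g. one per processor) gives the embarrassingly parallel regime.
estimates = [unbiased_estimate(lambda x: x ** 2) for _ in range(1000)]
print(np.mean(estimates))  # should be close to 1 for the standard normal target
```

The variance of each replicate depends on the burn-in parameter k and on how quickly the coupling forces the chains to meet; the paper's contribution is constructing such couplings for continuous-time PDMP samplers rather than for discrete-time kernels as in this sketch.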

References (51)
  1. Explicit convergence bounds for Metropolis Markov chains: isoperimetry, spectral gaps and profiles. arXiv preprint arXiv:2211.08959.
  2. Betancourt, M. (2017). A conceptual introduction to Hamiltonian Monte Carlo. arXiv preprint arXiv:1701.02434.
  3. The Zig-Zag process and super-efficient sampling for Bayesian analysis of big data. The Annals of Statistics, 47(3):1288–1320.
  4. The Boomerang Sampler. In Daumé III, H. and Singh, A., editors, Proceedings of the 37th International Conference on Machine Learning, volume 119 of Proceedings of Machine Learning Research, pages 908–918. PMLR.
  5. High-dimensional scaling limits of piecewise deterministic sampling algorithms. The Annals of Applied Probability, 32(5):3361–3407.
  6. Ergodicity of the ZigZag process. The Annals of Applied Probability, 29(4):2266–2301.
  7. Coupled Markov chain Monte Carlo for high-dimensional regression with half-t priors. arXiv preprint arXiv:2012.04798.
  8. Coupling-based Convergence Assessment of some Gibbs Samplers for High-Dimensional Bayesian Regression with Shrinkage Priors. Journal of the Royal Statistical Society Series B: Statistical Methodology, 84(3):973–996.
  9. Coupling and convergence for Hamiltonian Monte Carlo. The Annals of Applied Probability, 30(3):1209–1250.
  10. The bouncy particle sampler: A nonreversible rejection-free Markov chain Monte Carlo method. Journal of the American Statistical Association, 113(522):855–867.
  11. Lifting Markov chains to speed up mixing. In Proceedings of the Thirty-First Annual ACM Symposium on Theory of Computing, pages 275–281.
  12. Automatic Zig-Zag sampling in practice. Statistics and Computing, 32(6):107.
  13. The coupled rejection sampler. arXiv preprint arXiv:2201.09585.
  14. On the complexity of backward smoothing algorithms. arXiv preprint arXiv:2207.00976v2.
  15. Davis, M. H. (1984). Piecewise-deterministic Markov processes: A general class of non-diffusion stochastic models. Journal of the Royal Statistical Society: Series B (Methodological), 46(3):353–376.
  16. Exponential ergodicity of the bouncy particle sampler. The Annals of Statistics, 47(3).
  17. Randomized Hamiltonian Monte Carlo as scaling limit of the bouncy particle sampler and dimension-free convergence rates. The Annals of Applied Probability, 31(6):2612–2662.
  18. The total variation distance between high-dimensional Gaussians with the same mean. arXiv preprint arXiv:1810.08693.
  19. Analysis of a nonreversible Markov chain sampler. The Annals of Applied Probability, 10(3):726–752.
  20. Infinite dimensional piecewise deterministic Markov processes. arXiv preprint arXiv:2205.11452.
  21. Markov Chains. Springer International Publishing.
  22. Hybrid Monte Carlo. Physics Letters B, 195(2):216–222.
  23. Geometric ergodicity of the bouncy particle sampler. The Annals of Applied Probability, 30(5):2069–2098.
  24. Piecewise deterministic Markov processes and their invariant measures. Annales de l’Institut Henri Poincaré, Probabilités et Statistiques, 57(3):1442–1475.
  25. Piecewise deterministic Markov processes for continuous-time Monte Carlo. Statistical Science, 33(3):386–412.
  26. On the optimality and efficiency of common random numbers. Mathematics and Computers in Simulation, 26(6):502–512.
  27. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-6(6):721–741.
  28. Giles, M. B. (2015). Multilevel Monte Carlo methods. Acta Numerica, 24:259–328.
  29. Exact estimation for Markov chain equilibrium expectations. Journal of Applied Probability, 51(A):377–389.
  30. Coupling control variates for Markov chain Monte Carlo. Journal of Computational Physics, 228(19):7127–7136.
  31. A new Monte Carlo technique: antithetic variates. Mathematical Proceedings of the Cambridge Philosophical Society, 52(3):449–475.
  32. Hastings, W. K. (1970). Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57(1):97–109.
  33. Unbiased Hamiltonian Monte Carlo with couplings. Biometrika, 106(2):287–302.
  34. Huber, M. L. (2016). Perfect simulation, volume 148. CRC Press.
  35. Smoothing with couplings of conditional particle filters. Journal of the American Statistical Association, 115(530):721–729.
  36. Jacob, P. E., O’Leary, J., and Atchadé, Y. F. (2020). Unbiased Markov chain Monte Carlo methods with couplings. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 82(3):543–600.
  37. Lindvall, T. (2002). Lectures on the Coupling Method. Dover.
  38. Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6):1087–1092.
  39. Unbiased Markov chain Monte Carlo for intractable target distributions. Electronic Journal of Statistics, 14:2842–2891.
  40. Many processors, little time: MCMC for partitions via optimal transport couplings. In International Conference on Artificial Intelligence and Statistics, pages 3483–3514. PMLR.
  41. NuZZ: numerical Zig-Zag sampling for general models. arXiv preprint arXiv:2003.03636v2.
  42. Rejection-free Monte Carlo sampling for general potentials. Physical Review E, 85(2):026703.
  43. Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures & Algorithms, 9(1-2):223–252.
  44. Unbiased estimation with square root convergence for SDE models. Operations Research, 63(5):1026–1043.
  45. Monte Carlo Statistical Methods, volume 2. Springer.
  46. Optimal scaling for various Metropolis-Hastings algorithms. Statistical Science, 16(4):351–367.
  47. Concave-convex PDMP-based sampling. Journal of Computational and Graphical Statistics, 0(0):1–11.
  48. Thorisson, H. (2000). Coupling, Stationarity, and Regeneration. Springer-Verlag.
  49. Villani, C. (2009). Optimal transport: old and new, volume 338. Springer.
  50. Maximal couplings of the Metropolis-Hastings algorithm. In Banerjee, A. and Fukumizu, K., editors, Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, volume 130 of Proceedings of Machine Learning Research, pages 1225–1233. PMLR.
  51. The Coordinate Sampler: A Non-Reversible Gibbs-like MCMC Sampler. Statistics and Computing, 30:721–730.
