Anytime Monte Carlo (1612.03319v3)

Published 10 Dec 2016 in stat.CO and stat.ML

Abstract: Monte Carlo algorithms simulate some prescribed number of samples, taking some random real time to complete the computations necessary. This work considers the converse: to impose a real-time budget on the computation, which results in the number of samples simulated being random. To complicate matters, the real time taken for each simulation may depend on the sample produced, so that the samples themselves are not independent of their number, and a length bias with respect to compute time is apparent. This is especially problematic when a Markov chain Monte Carlo (MCMC) algorithm is used and the final state of the Markov chain -- rather than an average over all states -- is required, which is the case in parallel tempering implementations of MCMC. The length bias does not diminish with the compute budget in this case. It also occurs in sequential Monte Carlo (SMC) algorithms, which is the focus of this paper. We propose an anytime framework to address the concern, using a continuous-time Markov jump process to study the progress of the computation in real time. We first show that for any MCMC algorithm, the length bias of the final state's distribution due to the imposed real-time computing budget can be eliminated by using a multiple chain construction. The utility of this construction is then demonstrated on a large-scale SMC² implementation, using four billion particles distributed across a cluster of 128 graphics processing units on the Amazon EC2 service. The anytime framework imposes a real-time budget on the MCMC move steps within the SMC² algorithm, ensuring that all processors are simultaneously ready for the resampling step, demonstrably reducing idleness due to waiting times and providing substantial control over the total compute budget.
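The length bias and the multiple-chain remedy described in the abstract can be illustrated with a toy simulation. This is a sketch under simplifying assumptions, not the paper's construction: the two-state chain, the independence kernel, the deterministic hold times, and the round-robin two-chain schedule are all choices made here for demonstration only.

```python
import random

def mcmc_step(x, rng):
    # Independence kernel on {0, 1} with uniform stationary distribution,
    # so the unbiased probability of observing state 1 is 0.5.
    return rng.randrange(2)

def hold_time(x):
    # Real time needed to compute a move from state x: moves out of
    # state 1 are three times slower, which induces the length bias.
    return 1.0 if x == 0 else 3.0

def anytime_run(budget, n_chains, rng):
    # Update chains round-robin until the real-time budget expires
    # mid-move; return all states and the index of the interrupted chain.
    states = [0] * n_chains
    t, j = 0.0, 0
    while True:
        dt = hold_time(states[j])
        if t + dt > budget:
            return states, j
        states[j] = mcmc_step(states[j], rng)
        t += dt
        j = (j + 1) % n_chains

rng = random.Random(1)
n = 20000

# Single chain: the state observed at the deadline over-represents the
# slow state (expected frequency near 3/4 rather than 1/2).
naive = sum(anytime_run(50.5, 1, rng)[0][0] for _ in range(n)) / n

# Two chains: discard the chain that is mid-move at the deadline and
# keep the other, whose state is free of the length bias.
multi = 0.0
for _ in range(n):
    states, j = anytime_run(50.5, 2, rng)
    multi += states[1 - j]
multi /= n

print(f"single chain: {naive:.2f}")  # close to 0.75 (length-biased)
print(f"extra chain:  {multi:.2f}")  # close to 0.50 (bias removed)
```

In this toy, the state of the chain not being updated at the deadline has contributed nothing to the elapsed compute time since its last completed move, so it carries no length bias; this mirrors, in miniature, why an additional chain lets the interrupted one be discarded without biasing the reported state.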
