Persistent Sampling: Unleashing the Potential of Sequential Monte Carlo (2407.20722v1)

Published 30 Jul 2024 in stat.ML, cs.LG, and stat.CO

Abstract: Sequential Monte Carlo (SMC) methods are powerful tools for Bayesian inference but suffer from requiring many particles for accurate estimates, leading to high computational costs. We introduce persistent sampling (PS), an extension of SMC that mitigates this issue by allowing particles from previous iterations to persist. This generates a growing, weighted ensemble of particles distributed across iterations. In each iteration, PS utilizes multiple importance sampling and resampling from the mixture of all previous distributions to produce the next generation of particles. This addresses particle impoverishment and mode collapse, resulting in more accurate posterior approximations. Furthermore, this approach provides lower-variance marginal likelihood estimates for model comparison. Additionally, the persistent particles improve transition kernel adaptation for efficient exploration. Experiments on complex distributions show that PS consistently outperforms standard methods, achieving lower squared bias in posterior moment estimation and significantly reduced marginal likelihood errors, all at a lower computational cost. PS offers a robust, efficient, and scalable framework for Bayesian inference.
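To make the mechanics described above concrete, the following is a minimal, self-contained Python sketch of a persistent-sampling loop on a one-dimensional toy problem. It is not the authors' implementation: the tempered targets, the Gaussian random-walk Metropolis transition kernel, the linear temperature schedule, and the self-normalized estimates of the normalizing constants Z_t are all simplifying assumptions made for illustration. Only the overall structure follows the abstract: grow the ensemble instead of discarding old particles, reweight the full persistent ensemble against the mixture of all previous distributions via multiple importance sampling, then resample and move to produce the next generation.

```python
import numpy as np

# Minimal sketch of persistent sampling (PS) on a 1-D toy problem.
# Assumptions (not taken from the paper's reference implementation):
#   - tempered targets  pi_t(x) ∝ prior(x) * likelihood(x)**beta_t
#   - Gaussian random-walk Metropolis as the transition kernel
#   - normalizing constants Z_t estimated from the PS weights themselves

def log_prior(x):            # standard normal prior
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_like(x):             # toy bimodal "likelihood"
    return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

def log_tempered(x, beta):   # unnormalized tempered posterior
    return log_prior(x) + beta * log_like(x)

rng = np.random.default_rng(0)
N = 500                                   # particles added per iteration
betas = np.linspace(0.0, 1.0, 11)         # annealing schedule (assumed linear)

particles = rng.standard_normal(N)        # iteration 0: samples from the prior
origin = np.zeros(N, dtype=int)           # iteration each particle came from
log_Z = [0.0]                             # log normalizing-constant estimates (prior is normalized)

for t, beta in enumerate(betas[1:], start=1):
    # Multiple importance sampling: weight the *entire* persistent ensemble
    # against the mixture of all previous tempered distributions.
    log_num = log_tempered(particles, beta)
    log_mix = np.stack([log_tempered(particles, betas[s]) - log_Z[s]
                        for s in range(t)])                 # (t, n_particles)
    log_den = np.logaddexp.reduce(log_mix, axis=0) - np.log(t)
    log_w = log_num - log_den

    # Running estimate of Z_t (self-normalized); needed for the next mixture
    # and usable as a marginal-likelihood estimate at beta = 1.
    log_Z.append(np.logaddexp.reduce(log_w) - np.log(log_w.size))

    # Resample N new particles from the weighted persistent ensemble ...
    w = np.exp(log_w - np.logaddexp.reduce(log_w))
    idx = rng.choice(particles.size, size=N, p=w)
    new = particles[idx]

    # ... move them with a few Metropolis steps targeting pi_t ...
    for _ in range(5):
        prop = new + 0.5 * rng.standard_normal(N)
        accept = np.log(rng.random(N)) < log_tempered(prop, beta) - log_tempered(new, beta)
        new = np.where(accept, prop, new)

    # ... and append them: earlier generations persist instead of being discarded.
    particles = np.concatenate([particles, new])
    origin = np.concatenate([origin, np.full(N, t)])

print("final ensemble size:", particles.size)
print("log marginal-likelihood estimate:", log_Z[-1])
```

Because old particles are never discarded, the ensemble used for reweighting and resampling grows linearly with the number of iterations; this growing, iteration-indexed ensemble is what underlies the lower-variance posterior-moment and marginal-likelihood estimates described in the abstract.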

