Persistent Sampling: Enhancing the Efficiency of Sequential Monte Carlo
Abstract: Sequential Monte Carlo (SMC) samplers are powerful tools for Bayesian inference but suffer from high computational costs due to their reliance on large particle ensembles for accurate estimates. We introduce persistent sampling (PS), an extension of SMC that systematically retains and reuses particles from all prior iterations to construct a growing, weighted ensemble. By leveraging multiple importance sampling and resampling from a mixture of historical distributions, PS reduces the need for excessively large particle counts, directly addressing key limitations of SMC such as particle impoverishment and mode collapse. Crucially, PS achieves this without additional likelihood evaluations: weights for persistent particles are computed using cached likelihood values. This framework not only yields more accurate posterior approximations but also produces marginal likelihood estimates with significantly lower variance, enhancing reliability in model comparison. Furthermore, the persistent ensemble enables efficient adaptation of transition kernels by providing a larger, decorrelated particle pool. Experiments on high-dimensional Gaussian mixtures, hierarchical models, and non-convex targets demonstrate that PS consistently outperforms standard SMC and related variants, including recycled and waste-free SMC, achieving substantial reductions in mean squared error for posterior expectations and evidence estimates, all at reduced computational cost. PS thus establishes itself as a robust, scalable, and efficient alternative for complex Bayesian inference tasks.
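To make the weighting idea concrete, below is a minimal sketch of how persistent particles from past iterations could be reweighted toward the current target using cached log-likelihoods. This is an illustrative reading of the abstract, not the authors' exact algorithm: it assumes a tempered target sequence pi_t(theta) ∝ L(theta)^beta_t * pi_0(theta) and balance-heuristic (deterministic-mixture) multiple importance sampling weights; all names (`persistent_weights`, `betas`, `counts`, `log_z`) are hypothetical.

```python
import numpy as np
from scipy.special import logsumexp

def persistent_weights(log_like, log_prior, betas, counts, log_z):
    """Balance-heuristic weights for all retained ("persistent") particles.

    log_like, log_prior : cached log L and log pi_0 for every retained particle
    betas  : inverse temperatures beta_1..beta_t of the targets so far
    counts : number of particles generated at each past iteration
    log_z  : estimated log normalizing constants of each past target

    No new likelihood evaluations are needed: only the cached values are reused.
    """
    log_like = np.asarray(log_like, dtype=float)
    log_prior = np.asarray(log_prior, dtype=float)

    # Unnormalized log-density under the *current* target pi_t
    log_num = betas[-1] * log_like + log_prior

    # Denominator: mixture of all historical normalized targets,
    # weighted by how many particles each iteration contributed
    log_mix = np.stack([
        np.log(n) + b * log_like + log_prior - lz
        for b, n, lz in zip(betas, counts, log_z)
    ])
    log_den = logsumexp(log_mix, axis=0) - np.log(np.sum(counts))

    logw = log_num - log_den
    return np.exp(logw - logsumexp(logw))  # self-normalized weights
```

Resampling from the resulting weighted ensemble then draws on particles from *all* past iterations, which is what lets PS enlarge the effective particle pool without extra likelihood calls.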