
An efficient likelihood-free Bayesian inference method based on sequential neural posterior estimation (2311.12530v3)

Published 21 Nov 2023 in stat.ML, cs.LG, and stat.CO

Abstract: Sequential neural posterior estimation (SNPE) techniques have recently been proposed for simulation-based models with intractable likelihoods. Unlike approximate Bayesian computation, SNPE techniques learn the posterior from sequential simulations using neural network-based conditional density estimators that minimize a specific loss function. The SNPE method proposed by Lueckmann et al. (2017) uses a calibration kernel to boost the sample weights around the observed data, resulting in a concentrated loss function. However, the calibration kernel may increase the variances of both the empirical loss and its gradient, making training inefficient. To improve the stability of SNPE, this paper proposes an adaptive calibration kernel together with several variance reduction techniques. Numerical experiments confirm that the proposed method greatly speeds up training and approximates the posterior better than the original SNPE method and several existing competitors. We also demonstrate the superiority of the proposed method on a high-dimensional model with a real-world dataset.
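The calibration-kernel idea in the abstract can be sketched as a minimal, hypothetical illustration (not the authors' implementation): a kernel up-weights simulated samples whose outputs fall near the observed data, and those weights scale the per-sample negative log posterior terms in the empirical loss. The function names, the Gaussian kernel choice, and the bandwidth parameter `tau` are assumptions made here for illustration; the per-sample negative log posterior values would come from a conditional density estimator not shown.

```python
import numpy as np

def calibration_weights(x_sim, x_obs, tau):
    """Gaussian calibration kernel: up-weights simulations whose
    outputs x_sim lie close to the observed data x_obs.
    tau is the kernel bandwidth; a smaller tau concentrates the loss
    on samples near x_obs."""
    sq_dist = np.sum((x_sim - x_obs) ** 2, axis=-1)
    return np.exp(-0.5 * sq_dist / tau ** 2)

def calibrated_loss(neg_log_q, x_sim, x_obs, tau):
    """Calibration-weighted empirical loss: a weighted average of the
    per-sample negative log posterior values neg_log_q[i], where the
    weight of sample i is K((x_sim[i] - x_obs) / tau)."""
    w = calibration_weights(x_sim, x_obs, tau)
    return np.sum(w * neg_log_q) / np.sum(w)
```

As the abstract notes, concentrating the kernel (small `tau`) leaves few effectively weighted samples, which inflates the variance of the empirical loss and its gradient; the adaptive kernel and variance reduction techniques proposed in the paper target exactly this trade-off.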

References (52)
  1. C. Andrieu and G. O. Roberts. The pseudo-marginal approach for efficient Monte Carlo computations. The Annals of Statistics, 37(2):697–725, 2009.
  2. Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(3):269–342, 2010.
  3. S. Barthelmé and N. Chopin. Expectation propagation for likelihood-free inference. Journal of the American Statistical Association, 109(505):315–333, 2014.
  4. Approximate Bayesian computation in population genetics. Genetics, 162(4):2025–2035, 2002.
  5. Adaptive approximate Bayesian computation. Biometrika, 96(4):983–990, 2009.
  6. C. M. Bishop. Mixture density networks. Technical Report NCRG/94/004, Aston University, 1994.
  7. M. G. Blum and O. François. Non-linear regression models for approximate Bayesian computation. Statistics and Computing, 20:63–73, 2010.
  8. F. V. Bonassi and M. West. Sequential Monte Carlo with adaptive weights for approximate Bayesian computation. Bayesian Analysis, 10(1):171–187, 2015.
  9. Mining gold from implicit models to improve likelihood-free inference. Proceedings of the National Academy of Sciences, 117(10):5242–5249, 2020.
  10. The properties of high-dimensional data spaces: implications for exploring gene and protein expression data. Nature Reviews Cancer, 8(1):37–49, 2008.
  11. The frontier of simulation-based inference. Proceedings of the National Academy of Sciences, 117(48):30055–30062, 2020.
  12. Truncated proposals for scalable and hassle-free simulation-based inference. arXiv preprint arXiv:2210.04815, 2022.
  13. An adaptive sequential Monte Carlo method for approximate Bayesian computation. Statistics and Computing, 22:1009–1020, 2012.
  14. Density estimation using Real NVP. arXiv preprint arXiv:1605.08803, 2016.
  15. Neural spline flows. Advances in neural information processing systems, 32, 2019.
  16. On contrastive learning for likelihood-free inference. In International Conference on Machine Learning, pages 2771–2781. PMLR, 2020.
  17. R. Erhardt and S. A. Sisson. Modelling extremes using approximate Bayesian computation. Extreme Value Modelling and Risk Analysis, pages 281–306, 2016.
  18. P. Fearnhead and D. Prangle. Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 74(3):419–474, 2012.
  19. D. T. Gillespie. Exact stochastic simulation of coupled chemical reactions. The Journal of Physical Chemistry, 81(25):2340–2361, 1977.
  20. Variational methods for simulation-based inference. arXiv preprint arXiv:2203.04176, 2022.
  21. Training deep neural density estimators to identify mechanistic models of neural dynamics. Elife, 9:e56261, 2020.
  22. Automatic posterior transformation for likelihood-free inference. In International Conference on Machine Learning, pages 2404–2414. PMLR, 2019.
  23. A kernel two-sample test. The Journal of Machine Learning Research, 13(1):723–773, 2012.
  24. M. U. Gutmann and J. Corander. Bayesian optimization for likelihood-free inference of simulator-based statistical models. Journal of Machine Learning Research, 2016.
  25. Amortized Bayesian inference on generative dynamical network models of epilepsy using deep neural density estimators. Neural Networks, 163:178–194, 2023.
  26. Unbiased MLMC-based variational Bayes for likelihood-free inference. SIAM Journal on Scientific Computing, 44(4):A1884–A1910, 2022.
  27. Likelihood-free MCMC with amortized approximate ratio estimators. In International Conference on Machine Learning, pages 4239–4248. PMLR, 2020.
  28. T. Hesterberg. Weighted average importance sampling and defensive mixture distributions. Technometrics, 37(2):185–194, 1995.
  29. Efficient acquisition rules for model-based approximate Bayesian computation. Bayesian Analysis, 14(2):595–622, 2019.
  30. D. P. Kingma and J. Ba. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014.
  31. Normalizing flows: An introduction and review of current methods. IEEE Transactions on Pattern Analysis and Machine Intelligence, 43(11):3964–3979, 2020.
  32. D. Lopez-Paz and M. Oquab. Revisiting classifier two-sample tests. arXiv preprint arXiv:1610.06545, 2016.
  33. A. J. Lotka. Analytical note on certain rhythmic relations in organic systems. Proceedings of the National Academy of Sciences, 6(7):410–415, 1920.
  34. Flexible statistical inference for mechanistic models of neural dynamics. Advances in neural information processing systems, 30, 2017.
  35. Likelihood-free inference with emulator networks. In Symposium on Advances in Approximate Bayesian Inference, pages 32–53. PMLR, 2019.
  36. Benchmarking simulation-based inference. In International Conference on Artificial Intelligence and Statistics, pages 343–351. PMLR, 2021.
  37. Approximate Bayesian computational methods. Statistics and Computing, 22(6):1167–1180, 2012.
  38. Contrastive Neural Ratio Estimation. arXiv preprint arXiv:2210.06170, 2022.
  39. A. Owen and Y. Zhou. Safe and effective importance sampling. Journal of the American Statistical Association, 95(449):135–143, 2000.
  40. L. Paninski and J. P. Cunningham. Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience. Current opinion in neurobiology, 50:232–241, 2018.
  41. G. Papamakarios and I. Murray. Fast ε-free inference of simulation models with Bayesian conditional density estimation. Advances in neural information processing systems, 29, 2016.
  42. Masked autoregressive flow for density estimation. Advances in neural information processing systems, 30, 2017.
  43. Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows. In The 22nd International Conference on Artificial Intelligence and Statistics, pages 837–848. PMLR, 2019.
  44. On sequential Monte Carlo, partial rejection control and approximate Bayesian computation. Statistics and Computing, 22:1209–1222, 2012.
  45. Bayesian synthetic likelihood. Journal of Computational and Graphical Statistics, 27(1):1–11, 2018.
  46. Vision-as-inverse-graphics: Obtaining a rich 3d explanation of a scene from a single image. In Proceedings of the IEEE International Conference on Computer Vision Workshops, pages 851–859, 2017.
  47. On Bayesian inference for the M/G/1 queue with efficient MCMC sampling. arXiv preprint arXiv:1401.5548, 2014.
  48. Sequential Monte Carlo without likelihoods. Proceedings of the National Academy of Sciences, 104(6):1760–1765, 2007.
  49. Likelihood-free inference by ratio estimation. Bayesian Analysis, 17(1):1–31, 2022.
  50. Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems. Journal of the Royal Society Interface, 6(31):187–202, 2009.
  51. E. Veach and L. J. Guibas. Optimally combining sampling techniques for Monte Carlo rendering. In Proceedings of the 22nd annual conference on Computer graphics and interactive techniques, pages 419–428, 1995.
  52. S. N. Wood. Statistical inference for noisy nonlinear ecological dynamic systems. Nature, 466(7310):1102–1104, 2010.