
Simulation-based stacking (2310.17009v2)

Published 25 Oct 2023 in stat.ME, cs.LG, and stat.CO

Abstract: Simulation-based inference has become popular for amortized Bayesian computation. It is typical to have more than one posterior approximation, arising from different inference algorithms, different architectures, or simply the randomness of initialization and stochastic gradients. With a consistency guarantee, we present a general posterior stacking framework that makes use of all available approximations. Our stacking method can combine densities, simulation draws, confidence intervals, and moments, and it addresses the overall precision, calibration, coverage, and bias of the posterior approximation simultaneously. We illustrate our method on several benchmark simulations and a challenging cosmological inference task.
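The core stacking idea in the abstract, combining several posterior approximations into one weighted mixture, can be sketched as log-score stacking over simplex weights. This is a minimal illustration, not the paper's full method (which also handles draws, intervals, and moments); the function names, the EM-style multiplicative update, and the toy Gaussian setup are all assumptions made for the example.

```python
import numpy as np

def stack_weights(densities, n_iter=500):
    """Hypothetical log-score stacking sketch.

    densities[i, k] = q_k(theta_i): the k-th approximate posterior density
    evaluated at the i-th validation draw. Returns simplex weights w that
    (approximately) maximize the average log mixture density, via an
    EM-style multiplicative update that stays on the simplex.
    """
    _, k = densities.shape
    w = np.full(k, 1.0 / k)                      # start from uniform weights
    for _ in range(n_iter):
        mix = densities @ w                      # mixture density at each draw
        resp = densities * w / mix[:, None]      # per-draw responsibilities
        w = resp.mean(axis=0)                    # each row of resp sums to 1
    return w

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Toy example: stack two Gaussian "approximations" of a N(0, 1) target.
rng = np.random.default_rng(0)
draws = rng.normal(0.0, 1.0, size=2000)
dens = np.column_stack([
    normal_pdf(draws, 0.0, 1.0),   # well-matched approximation
    normal_pdf(draws, 2.0, 1.0),   # biased approximation
])
w = stack_weights(dens)
print(w)  # most weight should land on the well-matched approximation
```

Because the weights are fit by maximizing held-out log density rather than any in-sample criterion, a systematically biased approximation is down-weighted even if it was trained on the same simulator.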
