Stochastic Gradient Bayesian Optimal Experimental Designs for Simulation-based Inference (2306.15731v1)
Abstract: Simulation-based inference (SBI) methods tackle complex scientific models with challenging inverse problems. However, SBI models are often non-differentiable, a significant hurdle that hampers the use of gradient-based optimization techniques. Bayesian Optimal Experimental Design (BOED) is a powerful approach that aims to make the most efficient use of experimental resources for improved inferences. While stochastic gradient BOED methods have shown promising results on high-dimensional design problems, the integration of BOED with SBI has been largely neglected because many SBI simulators are non-differentiable. In this work, we establish a crucial connection between ratio-based SBI algorithms and stochastic gradient-based variational inference by leveraging mutual information bounds. This connection allows us to extend BOED to SBI applications, enabling the simultaneous optimization of experimental designs and amortized inference functions. We demonstrate our approach on a simple linear model and offer implementation details for practitioners.
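The abstract's core idea, coupling a contrastive (ratio-based) SBI objective with stochastic-gradient design optimization through a mutual information bound, can be illustrated on the simple linear model it mentions. The sketch below is not the paper's implementation: the simulator, critic architecture, InfoNCE-style bound, and all hyperparameters are illustrative assumptions. It jointly updates a scalar design `d` and a critic network by ascending a lower bound on I(θ; y | d), so the same training loop produces both an optimized design and an amortized ratio estimator.

```python
# Minimal sketch (assumed setup, not the paper's exact method): jointly optimize
# an experimental design and a contrastive critic by maximizing an InfoNCE-style
# lower bound on the mutual information I(theta; y | d) for a toy linear model.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

K = 128                                        # simulations per gradient step
design = nn.Parameter(torch.randn(1))          # scalar design d, optimized jointly

# Critic T(theta, y, d): scores how plausible a (theta, y) pair is at design d.
critic = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))

opt = torch.optim.Adam(list(critic.parameters()) + [design], lr=1e-3)

def simulate(theta, d):
    """Toy differentiable linear simulator: y = d * theta + Gaussian noise."""
    return d * theta + 0.1 * torch.randn_like(theta)

for step in range(2001):
    theta = torch.randn(K, 1)                  # draws from the N(0, 1) prior
    y = simulate(theta, design)                # outcomes at the current design

    # Score all (theta_j, y_i) pairs; the diagonal holds the jointly sampled pairs.
    th = theta.unsqueeze(0).expand(K, K, 1)
    yy = y.unsqueeze(1).expand(K, K, 1)
    dd = design.expand(K, K, 1)
    scores = critic(torch.cat([th, yy, dd], dim=-1)).squeeze(-1)   # (K, K)

    # InfoNCE-style lower bound on I(theta; y | d); ascend it with respect to
    # both the critic parameters and the design.
    bound = (scores.diag() - torch.logsumexp(scores, dim=1) + math.log(K)).mean()

    opt.zero_grad()
    (-bound).backward()
    opt.step()

    if step % 500 == 0:
        print(f"step {step:4d}  MI bound ≈ {bound.item():.3f}  design d = {design.item():.3f}")
```

After training, `design` holds a (locally) optimal design under the bound, and the trained critic approximates the likelihood-to-evidence log-ratio up to a per-observation constant, which can be reused for amortized posterior inference; this mirrors the simultaneous optimization of designs and inference functions described in the abstract.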
Authors: Vincent D. Zaballa, Elliot E. Hui