BayesFlow: Amortized Bayesian Workflows With Neural Networks (2306.16015v2)
Abstract: Modern Bayesian inference involves a mixture of computational techniques for estimating, validating, and drawing conclusions from probabilistic models as part of principled workflows for data analysis. Typical problems in Bayesian workflows are the approximation of intractable posterior distributions for diverse model types and the comparison of competing models of the same process in terms of their complexity and predictive performance. This manuscript introduces the Python library BayesFlow for simulation-based training of established neural network architectures for amortized data compression and inference. Amortized Bayesian inference, as implemented in BayesFlow, enables users to train custom neural networks on model simulations and re-use these networks for any subsequent application of the models. Since the trained networks can perform inference almost instantaneously, the upfront neural network training is quickly amortized.
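The amortized workflow summarized in the abstract (simulate from the model, train a summary and an inference network on those simulations, then re-use the trained networks for near-instant posterior inference) can be sketched in a few lines of Python. The example below is a minimal, illustrative sketch assuming the BayesFlow v1.x API described in the paper (module paths such as `bf.simulation.Prior`, `bf.networks.InvertibleNetwork`, `bf.amortizers.AmortizedPosterior`, and `bf.trainers.Trainer`); the toy Gaussian model, `prior_fun`, and `simulator_fun` are made up for illustration, and exact class names or signatures may differ between library releases.

```python
# Minimal sketch of an amortized posterior-estimation workflow with BayesFlow.
# Assumes the v1.x API; names and signatures may differ in other releases.
import numpy as np
import bayesflow as bf

# 1. Simulation program: a prior over parameters and a forward simulator
#    for a toy Gaussian model (location and log-scale).
def prior_fun():
    return np.random.normal(loc=0.0, scale=1.0, size=2)

def simulator_fun(theta, n_obs=50):
    loc, log_scale = theta
    return np.random.normal(loc=loc, scale=np.exp(log_scale), size=(n_obs, 1))

prior = bf.simulation.Prior(prior_fun=prior_fun)
simulator = bf.simulation.Simulator(simulator_fun=simulator_fun)
generative_model = bf.simulation.GenerativeModel(prior=prior, simulator=simulator)

# 2. Neural approximators: a summary network compresses each simulated data set,
#    an invertible inference network approximates the posterior.
summary_net = bf.networks.DeepSet()
inference_net = bf.networks.InvertibleNetwork(num_params=2)
amortizer = bf.amortizers.AmortizedPosterior(inference_net, summary_net)

# 3. Simulation-based (online) training: this is the upfront cost that is amortized.
trainer = bf.trainers.Trainer(amortizer=amortizer, generative_model=generative_model)
history = trainer.train_online(epochs=10, iterations_per_epoch=500, batch_size=32)

# 4. Near-instant inference on new data by re-using the trained networks.
observed_data = simulator_fun(prior_fun())[np.newaxis, ...].astype(np.float32)
posterior_samples = amortizer.sample({"summary_conditions": observed_data},
                                     n_samples=1000)
```

After training, step 4 can be repeated for any number of new data sets compatible with the model without retraining, which is what makes the upfront simulation-based training pay off.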
Authors: Stefan T. Radev, Marvin Schmitt, Lukas Schumacher, Lasse Elsemüller, Valentin Pratz, Yannik Schälte, Ullrich Köthe, Paul-Christian Bürkner