Consistency Models for Scalable and Fast Simulation-Based Inference

Published 9 Dec 2023 in cs.LG, cs.AI, and stat.ML (arXiv:2312.05440v3)

Abstract: Simulation-based inference (SBI) is constantly in search of more expressive and efficient algorithms to accurately infer the parameters of complex simulation models. In line with this goal, we present consistency models for posterior estimation (CMPE), a new conditional sampler for SBI that inherits the advantages of recent unconstrained architectures and overcomes their sampling inefficiency at inference time. CMPE essentially distills a continuous probability flow and enables rapid few-shot inference with an unconstrained architecture that can be flexibly tailored to the structure of the estimation problem. We provide hyperparameters and default architectures that support consistency training over a wide range of different dimensions, including low-dimensional ones which are important in SBI workflows but were previously difficult to tackle even with unconditional consistency models. Our empirical evaluation demonstrates that CMPE not only outperforms current state-of-the-art algorithms on hard low-dimensional benchmarks, but also achieves competitive performance with much faster sampling speed on two realistic estimation problems with high data and/or parameter dimensions.
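The abstract's core idea is that a consistency model maps a noisy parameter draw directly back to a clean posterior sample in one network call, conditioned on the observed data, so sampling needs only a handful of model evaluations instead of a long ODE/SDE solve. Below is a minimal sketch of such few-shot conditional sampling. It assumes an already-trained conditional consistency network `consistency_fn(theta_t, sigma, y)` and an EDM-style geometric noise grid; the function name, schedule, and default values are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch: few-shot posterior sampling with a conditional consistency model.
# Assumes `consistency_fn(theta_t, sigma, y)` maps a noisy parameter vector
# theta_t at noise level sigma to an estimate of a clean posterior sample
# theta_0 given observed data y. All names/defaults here are hypothetical.

import math
import torch

@torch.no_grad()
def sample_posterior(consistency_fn, y, num_steps=2,
                     sigma_max=80.0, sigma_min=0.002, dim=2):
    """Draw one posterior sample using `num_steps` model calls."""
    # Start from pure noise at the highest noise level.
    theta = sigma_max * torch.randn(dim)
    # Coarse geometric grid of noise levels (few-shot regime).
    sigmas = torch.exp(torch.linspace(math.log(sigma_max),
                                      math.log(sigma_min), num_steps))
    # First call: jump from noise directly to a clean estimate.
    theta = consistency_fn(theta, sigmas[0], y)
    # Optional refinement: re-noise at lower levels and denoise again.
    for sigma in sigmas[1:]:
        theta = theta + sigma * torch.randn_like(theta)
        theta = consistency_fn(theta, sigma, y)
    return theta
```

With `num_steps=1` this collapses to single-shot sampling (one network evaluation per posterior draw); a small number of additional re-noise/denoise steps trades a little speed for sample quality, which is the few-shot trade-off the abstract refers to.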
