
Constraining cosmological parameters from N-body simulations with Variational Bayesian Neural Networks (2301.03991v1)

Published 9 Jan 2023 in astro-ph.IM, astro-ph.CO, cs.AI, and cs.LG

Abstract: Methods based on deep learning have recently been applied to astrophysical parameter recovery thanks to their ability to capture information from complex data. One such method is approximate Bayesian neural networks (BNNs), which have been shown to yield consistent posterior distributions over the parameter space, useful for uncertainty quantification. However, like any modern neural network, they tend to produce overly confident uncertainty estimates and can introduce bias when applied to data. In this work, we implement multiplicative normalizing flows (MNFs), a family of approximate posteriors for the parameters of BNNs intended to enhance the flexibility of the variational posterior distribution, to extract $\Omega_m$, $h$, and $\sigma_8$ from the QUIJOTE simulations. We compared this method with standard BNNs and the flipout estimator. We found that MNFs combined with BNNs outperform the other models, obtaining predictive performance almost one order of magnitude better than standard BNNs, extracting $\sigma_8$ with high accuracy ($r^2 = 0.99$), and producing precise uncertainty estimates. The latter implies that MNFs provide more realistic predictive distributions, closer to the true posterior, mitigating the bias introduced by the variational approximation and allowing us to work with well-calibrated networks.
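The core idea behind MNFs can be illustrated with a minimal sketch: a multiplicative latent variable is passed through a normalizing flow (here a single planar flow, following Rezende & Mohamed 2015) and scales the weight means of a variational dense layer, making the approximate weight posterior more flexible than a plain mean-field Gaussian. This is a hedged NumPy illustration of the concept, not the paper's implementation; all function and parameter names (`planar_flow`, `mnf_dense`, `z0`, `flow_params`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def planar_flow(z, u, w, b):
    # One planar-flow step: f(z) = z + u * tanh(w.z + b),
    # an invertible transform that enriches the base distribution of z.
    return z + u * np.tanh(z @ w + b)

def mnf_dense(x, mu, log_sigma, z0, flow_params):
    """Sketch of an MNF-style dense layer.

    A multiplicative latent z (one entry per input unit) is transformed
    by a chain of planar flows, then scales the rows of the weight-mean
    matrix; per-weight Gaussian noise models the remaining uncertainty.
    """
    z = z0
    for (u, w, b) in flow_params:
        z = planar_flow(z, u, w, b)
    eps = rng.standard_normal(mu.shape)
    # Sampled weights: flow-transformed multiplier times the means,
    # plus mean-field Gaussian perturbations.
    W = z[:, None] * mu + np.exp(log_sigma) * eps
    return x @ W
```

Sampling the layer many times and looking at the spread of the outputs is what yields the predictive uncertainty the abstract refers to; in the actual paper this is done with flow-based variational layers trained by maximizing an evidence lower bound.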
