Test of Artificial Neural Networks in Likelihood-free Cosmological Constraints: A Comparison of IMNN and DAE (2211.05064v4)

Published 9 Nov 2022 in astro-ph.CO

Abstract: When constraining cosmological parameters with observational Hubble data (OHD) and Type Ia supernova data, the combination of a Masked Autoregressive Flow (MAF) and a Denoising Autoencoder (DAE) performs well: the DAE extracts features from the OHD, and the MAF estimates the posterior distribution of the cosmological parameters. We ask whether a better tool exists for compressing large data sets so as to obtain better results when constraining the cosmological parameters. Information maximising neural networks (IMNN), a simulation-based machine learning technique proposed earlier, have been shown in a series of numerical examples to find optimal, non-linear summaries robustly. In this work, we compare the dimensionality-reduction capabilities of IMNN and DAE. We use IMNN and DAE to compress the data to different dimensions and set different learning rates for the MAF that computes the posterior. The training data and mock OHD are generated from a simple Gaussian likelihood, the spatially flat ΛCDM model, and the real OHD. To avoid the complexity of comparing the posteriors directly, we define several criteria for comparing IMNN and DAE.
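The mock-data setup described in the abstract can be sketched briefly. The snippet below is an illustrative example, not the authors' code: it draws mock Hubble-parameter measurements under the spatially flat ΛCDM expansion rate with additive Gaussian noise, which is the kind of training data the paper feeds to the compression networks. The redshift grid, fiducial parameters, prior ranges, and noise level are all assumptions made here for illustration.

```python
import numpy as np

# Minimal sketch (not the authors' pipeline): mock OHD-like data under
# spatially flat LambdaCDM, H(z) = H0 * sqrt(Om*(1+z)^3 + 1 - Om), with
# additive Gaussian noise. Redshifts, fiducial parameters, prior ranges,
# and the noise level are illustrative assumptions.

def hubble_flat_lcdm(z, h0, omega_m):
    """Expansion rate H(z) for a spatially flat LambdaCDM model."""
    return h0 * np.sqrt(omega_m * (1.0 + z) ** 3 + 1.0 - omega_m)

def simulate_mock_ohd(z, h0, omega_m, sigma, rng):
    """Draw one mock H(z) data vector with Gaussian noise of width sigma."""
    return hubble_flat_lcdm(z, h0, omega_m) + rng.normal(0.0, sigma, size=z.shape)

rng = np.random.default_rng(0)
z_grid = np.linspace(0.07, 2.0, 31)   # assumed redshift coverage
sigma_hz = 10.0                       # assumed constant error, km/s/Mpc

# Training set: many simulations at parameters drawn from broad uniform priors.
n_sims = 1000
h0_draws = rng.uniform(60.0, 80.0, n_sims)
om_draws = rng.uniform(0.1, 0.5, n_sims)
train = np.stack([
    simulate_mock_ohd(z_grid, h0, om, sigma_hz, rng)
    for h0, om in zip(h0_draws, om_draws)
])

# One mock OHD realisation at a fiducial cosmology, standing in for real data.
mock_ohd = simulate_mock_ohd(z_grid, 70.0, 0.3, sigma_hz, rng)
print(train.shape, mock_ohd.shape)   # (1000, 31) (31,)
```

In the paper, data vectors like `train` would then be compressed to low-dimensional summaries by either an IMNN or a DAE before the MAF estimates the posterior over (H0, Ωm).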

Citations (1)
