Advancing Cosmological Parameter Estimation and Hubble Parameter Reconstruction with Long Short-Term Memory and Efficient-Kolmogorov-Arnold Networks (2504.00392v1)

Published 1 Apr 2025 in astro-ph.CO

Abstract: In this work, we propose a novel approach for cosmological parameter estimation and Hubble parameter reconstruction using Long Short-Term Memory (LSTM) networks and Efficient-Kolmogorov-Arnold Networks (Ef-KAN). LSTM networks are employed to extract features from observational data, enabling accurate parameter inference and posterior distribution estimation without relying on tractable likelihood functions. This method achieves performance comparable to traditional Markov Chain Monte Carlo (MCMC) techniques, offering a computationally efficient alternative for high-dimensional parameter spaces. By sampling from the reconstructed data and comparing it with mock data, our LSTM-based constraint procedure demonstrates superior constraint accuracy and effectively captures the degeneracies and correlations between the cosmological parameters. Additionally, the Ef-KAN model is introduced to reconstruct the Hubble parameter H(z) from both observational and mock data. Ef-KAN is an entirely data-driven approach, free from prior assumptions, and demonstrates superior capability in modeling complex, non-linear data distributions. We validate the Ef-KAN method on the Hubble parameter, demonstrating that H(z) can be reconstructed with high accuracy. By combining LSTM and Ef-KAN, we provide a robust framework for cosmological parameter inference and Hubble parameter reconstruction, paving the way for future research in cosmology, especially when dealing with complex datasets and high-dimensional parameter spaces.
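
To illustrate the likelihood-free parameter-estimation idea described in the abstract, the sketch below trains an LSTM regressor that maps mock H(z) sequences to (H0, Omega_m). This is not the paper's Ef-KAN or LSTM code: the flat LambdaCDM mock-data generator, the prior ranges, the Gaussian noise level, and all names (`LSTMRegressor`, `make_mock_batch`, `hz_flat_lcdm`) are assumptions introduced here for illustration only.

```python
# Hypothetical sketch of likelihood-free parameter regression on mock H(z) data.
# Not the paper's implementation; every modeling choice below is an assumption.
import torch
import torch.nn as nn

def hz_flat_lcdm(z, H0, Om):
    """H(z) = H0 * sqrt(Om*(1+z)^3 + 1 - Om) for flat LambdaCDM (assumed generator)."""
    return H0 * torch.sqrt(Om * (1.0 + z) ** 3 + 1.0 - Om)

def make_mock_batch(n, z):
    """Draw parameters from broad (assumed) priors and simulate noisy H(z) sequences."""
    H0 = torch.empty(n, 1).uniform_(60.0, 80.0)      # km/s/Mpc
    Om = torch.empty(n, 1).uniform_(0.1, 0.5)
    hz = hz_flat_lcdm(z.unsqueeze(0), H0, Om)        # shape (n, len(z))
    hz = hz + 5.0 * torch.randn_like(hz)             # crude observational noise
    x = torch.stack([z.expand_as(hz), hz], dim=-1)   # sequence of (z, H) pairs
    return x, torch.cat([H0, Om], dim=1)

class LSTMRegressor(nn.Module):
    """LSTM feature extractor followed by a small head predicting (H0, Om)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):
        _, (h, _) = self.lstm(x)   # h: (num_layers, batch, hidden)
        return self.head(h[-1])    # (batch, 2)

z_grid = torch.linspace(0.05, 2.0, 31)   # assumed redshift sampling of the mock data
model = LSTMRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x, theta = make_mock_batch(256, z_grid)
    loss = nn.functional.mse_loss(model(x), theta)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

A fuller treatment would normalize the targets, turn the point predictions into posterior estimates (e.g., via ensembles or simulation-based inference), and compare the resulting constraints against MCMC, which is the comparison the paper reports.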
