Non-Gaussian forecasts of weak lensing with and without priors (1506.05356v1)
Abstract: Assuming a Euclid-like weak lensing data set, we compare different methods of dealing with its inherent parameter degeneracies. Including priors in a data analysis can mask the information content of the data set itself. However, since the information content of a data set is usually estimated with the Fisher matrix, priors are added in order to enforce an approximately Gaussian likelihood. Here, we compare priorless forecasts to more conventional forecasts that use priors. If no priors are used, we find strongly non-Gaussian likelihoods for 2d weak lensing, which we approximate with the DALI expansion. Without priors, the confidence regions predicted by the Fisher matrix of the 2d weak lensing likelihood include unphysical values of $\Omega_m$ and $h$, since the Fisher matrix does not capture the shape of the likelihood well; the Cramér-Rao inequality then need not apply. We find that DALI and Markov Chain Monte Carlo sampling predict the presence of dark energy with high significance, whereas a Fisher forecast of the same data set also allows decelerated expansion. We also find that a 2d weak lensing analysis provides a sharp lower limit on the Hubble constant, $h > 0.4$, even if the equation of state of dark energy is jointly constrained by the data. This limit is not predicted by the Fisher matrix and is usually masked in other works by a sharp prior on $h$. Additionally, we find that DALI estimates Figures of Merit better than the Fisher matrix in the presence of non-Gaussianities. We further demonstrate how DALI allows switching to a Hamiltonian Monte Carlo sampling of a highly curved likelihood, with acceptance rates of $\approx 0.5$, effective coverage of the parameter space, and effectively costless leapfrog steps. This shows how quick forecasts can be upgraded to accurate forecasts whenever needed. Results were obtained with the public code from http://lnasellentin.github.io/DALI/
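The following is a minimal, self-contained sketch (not the authors' released DALI code) of the idea behind the last point: once the DALI tensors have been precomputed, the expanded log-likelihood and its gradient are polynomials in the parameter offset, so each Hamiltonian Monte Carlo leapfrog step reduces to a few tensor contractions instead of a new theory evaluation. The tensors F, S, Q, the expansion coefficients, and the step-size settings below are illustrative assumptions, not values from the paper.

```python
# Sketch: HMC with leapfrog steps on a DALI-like polynomial log-likelihood.
# All tensor values and coefficients are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
ndim = 2

F = np.array([[4.0, 1.0], [1.0, 2.0]])              # Fisher matrix (quadratic term)
v = np.array([0.6, -0.3])                            # direction for a toy third-order tensor
S = 0.3 * np.einsum('a,b,c->abc', v, v, v)           # fully symmetric cubic DALI-like tensor
I = np.eye(ndim)                                     # symmetrized quartic tensor keeps the
Q = 0.5 * (np.einsum('ab,cd->abcd', I, I)            # toy posterior bounded from below
           + np.einsum('ac,bd->abcd', I, I)
           + np.einsum('ad,bc->abcd', I, I)) / 3.0

def neg_log_post(d):
    """-ln L of a DALI-like polynomial expansion around the fiducial point."""
    return (0.5 * np.einsum('ab,a,b', F, d, d)
            + 0.5 * np.einsum('abc,a,b,c', S, d, d, d)
            + 0.125 * np.einsum('abcd,a,b,c,d', Q, d, d, d, d))

def grad_neg_log_post(d):
    """Analytic gradient: again only tensor contractions, no theory re-evaluation."""
    return (np.einsum('ab,b->a', F, d)
            + 1.5 * np.einsum('abc,b,c->a', S, d, d)
            + 0.5 * np.einsum('abcd,b,c,d->a', Q, d, d, d))

def hmc_step(theta, step=0.1, n_leap=20):
    """One HMC proposal: draw momentum, integrate with leapfrog, accept/reject."""
    p = rng.standard_normal(ndim)                    # unit mass matrix
    q, pn = theta.copy(), p.copy()
    H0 = neg_log_post(q) + 0.5 * p @ p
    pn -= 0.5 * step * grad_neg_log_post(q)          # initial half kick
    for _ in range(n_leap):
        q += step * pn                               # drift
        pn -= step * grad_neg_log_post(q)            # full kick
    pn += 0.5 * step * grad_neg_log_post(q)          # undo half of the final kick
    H1 = neg_log_post(q) + 0.5 * pn @ pn
    if np.log(rng.uniform()) < H0 - H1:              # Metropolis accept/reject
        return q, True
    return theta, False

theta = np.zeros(ndim)
samples, accepted = [], 0
for _ in range(2000):
    theta, acc = hmc_step(theta)
    samples.append(theta.copy())
    accepted += acc
print("acceptance rate:", accepted / len(samples))
```

In this toy setting the gradient never touches an underlying weak lensing theory code, which is what makes the leapfrog steps effectively costless once the expansion tensors are in hand.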