
Scale-dependent Bias from the Reconstruction of Non-Gaussian Distributions (1012.1859v2)

Published 8 Dec 2010 in astro-ph.CO

Abstract: Primordial non-Gaussianity introduces a scale-dependent variation in the clustering of density peaks corresponding to rare objects. This variation, parametrized by the bias, is investigated on scales where linear perturbation theory is sufficiently accurate. The bias is obtained directly in real space by comparing the one- and two-point probability distributions of density fluctuations. We show that these distributions can be reconstructed using a bivariate Edgeworth series, presented here up to an arbitrarily high order. The Edgeworth formalism is shown to be well suited for 'local' cubic-order non-Gaussianity parametrized by g_NL. We show that a strong scale dependence in the bias can be produced by g_NL of order 10,000, consistent with CMB constraints. On correlation lengths of ~100 Mpc, current constraints on g_NL still allow the bias for the most massive clusters to be enhanced by 20-30% over the Gaussian value. We further examine the bias as a function of mass scale, and also explore the relationship between the clustering and the abundance of massive clusters in the presence of g_NL. We explain why the Edgeworth formalism, though technically challenging, is a very powerful technique for constraining high-order non-Gaussianity with large-scale structures.
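To illustrate the kind of reconstruction the abstract describes, a minimal univariate sketch follows. It is not the paper's bivariate expansion: the function name `edgeworth_pdf`, the truncation at the kurtosis order, and the use of standardized cumulants `kappa3` and `kappa4` are all illustrative assumptions. The idea is that a weakly non-Gaussian one-point distribution can be written as a Gaussian multiplied by a series in probabilists' Hermite polynomials, with coefficients set by the cumulants of the field.

```python
import numpy as np

def edgeworth_pdf(x, kappa3=0.0, kappa4=0.0):
    """Edgeworth-corrected one-point PDF of a standardized density field.

    x       : field value in units of its standard deviation (delta/sigma)
    kappa3  : standardized skewness cumulant
    kappa4  : standardized kurtosis cumulant

    Returns the Gaussian PDF times the Edgeworth correction, truncated
    after the kappa3**2 term (an illustrative choice, not the paper's
    arbitrarily high order).
    """
    # Base Gaussian
    phi = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
    # Probabilists' Hermite polynomials He_n(x)
    He3 = x**3 - 3.0 * x
    He4 = x**4 - 6.0 * x**2 + 3.0
    He6 = x**6 - 15.0 * x**4 + 45.0 * x**2 - 15.0
    # Standard Edgeworth series to second order in the cumulants
    correction = (1.0
                  + (kappa3 / 6.0) * He3
                  + (kappa4 / 24.0) * He4
                  + (kappa3**2 / 72.0) * He6)
    return phi * correction

# Usage: for kappa3 = kappa4 = 0 this reduces to the Gaussian, and the
# Hermite terms integrate to zero, so the PDF stays normalized.
x = np.linspace(-10.0, 10.0, 20001)
p = edgeworth_pdf(x, kappa3=0.1, kappa4=0.05)
```

The paper's bivariate version plays the same game with the joint two-point distribution of the field at two points, which is what lets the bias be read off by comparing the one- and two-point distributions directly in real space.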
