
Refining local-type primordial non-Gaussianity: Sharpened $b_φ$ constraints through bias expansion (2501.14873v2)

Published 24 Jan 2025 in astro-ph.CO and astro-ph.GA

Abstract: Local-type primordial non-Gaussianity (PNG), predicted by many non-minimal models of inflation, creates a scale-dependent contribution to the power spectrum of large-scale structure (LSS) tracers. Its amplitude is characterized by the product $b_\phi f_{\rm NL}^{\rm loc}$, where $b_\phi$ is an astrophysical parameter dependent on the properties of the tracer. However, $b_\phi$ exhibits significant secondary dependence on halo concentration and other astrophysical properties, which may bias and weaken the constraints on $f_{\rm NL}^{\rm loc}$. In this work, we demonstrate that incorporating knowledge of the relation between Lagrangian bias parameters and $b_\phi$ can significantly enhance PNG constraints. We employ the Hybrid Effective Field Theory (HEFT) approach at the field level and a linear regression model to seek a connection between the bias parameters and $b_{\phi}$ for halo and galaxy samples, constructed using the \textsc{AbacusSummit} simulation suite and mimicking the luminous red galaxies (LRGs) and quasi-stellar objects (QSOs) of the Dark Energy Spectroscopic Instrument (DESI) survey. For the fixed-mass halo samples, our full bias model reduces the uncertainty by more than 70\%, with most of that improvement coming from $b_\nabla$, which we find to be an excellent proxy for concentration. For the galaxy samples, our model reduces the uncertainty on $b_\phi$ by 80\% for all tracers. By adopting Lagrangian-bias informed priors on the parameter $b_\phi$, future analyses can thus constrain $f_{\rm NL}^{\rm loc}$ with less bias and smaller errors.
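The core statistical step described in the abstract is a linear regression that predicts $b_\phi$ from Lagrangian bias parameters such as $b_1$, $b_2$, and $b_\nabla$, with the reduction in residual scatter quantifying how much tighter the prior on $b_\phi$ becomes. The sketch below illustrates that idea on purely synthetic data; the variable names, coefficients, and noise levels are invented for illustration, whereas the paper derives the bias parameters from field-level HEFT fits to AbacusSummit samples.

```python
# Illustrative sketch (synthetic data): regress b_phi on Lagrangian bias
# parameters and measure the fractional reduction in scatter. Coefficients
# and noise levels here are assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 500  # number of halo/galaxy samples (illustrative)

# Synthetic Lagrangian bias parameters for each sample.
b1 = rng.normal(1.5, 0.5, n)
b2 = rng.normal(0.0, 0.3, n)
bnabla = rng.normal(-1.0, 0.4, n)

# Synthetic "true" b_phi, assumed roughly linear in the bias parameters
# with intrinsic scatter (a toy stand-in for the simulation measurements).
bphi = 2.0 * b1 + 0.5 * b2 - 1.5 * bnabla + rng.normal(0.0, 0.2, n)

# Design matrix with an intercept; ordinary least-squares fit.
X = np.column_stack([np.ones(n), b1, b2, bnabla])
coef, *_ = np.linalg.lstsq(X, bphi, rcond=None)
resid = bphi - X @ coef

# Fractional reduction in scatter relative to using the sample mean alone,
# analogous to the uncertainty reductions quoted in the abstract.
reduction = 1.0 - resid.std() / bphi.std()
print(f"scatter reduction: {reduction:.0%}")
```

In this toy setup the regression recovers most of the scatter in $b_\phi$ because the data were generated from a linear relation; the paper's result is that a comparable linear model works well for realistic halo and galaxy samples, which motivates Lagrangian-bias informed priors on $b_\phi$.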
