Propagating Uncertainties in the SALT3 Model Training Process to Cosmological Constraints (2212.06879v2)

Published 13 Dec 2022 in astro-ph.CO and astro-ph.IM

Abstract: Type Ia supernovae (SNe Ia) are standardizable candles that must be modeled empirically to yield cosmological constraints. To understand the robustness of this modeling to variations in the model training procedure, we build an end-to-end pipeline to test the recently developed SALT3 model. We explore the consequences of removing pre-2000s low-$z$ or poorly calibrated $U$-band data, adjusting the amount and fidelity of SN Ia spectra, and using a model-independent framework to simulate the training data. We find the SALT3 model surfaces are improved by having additional spectra and $U$-band data, and can be shifted by $\sim 5\%$ if host galaxy contamination is not sufficiently removed from SN spectra. We find that resulting measurements of $w$ are consistent to within $2.5\%$ for all training variants explored in this work, with the largest shifts coming from variants that add color-dependent calibration offsets or host galaxy contamination to the training spectra, and those that remove pre-2000s low-$z$ data. These results demonstrate that the SALT3 model training procedure is largely robust to reasonable variations in the training data, but that additional attention must be paid to the treatment of spectroscopic data in the training process. We also find that the training procedure is sensitive to the color distributions of the input data; the resulting $w$ measurement can be biased by $\sim2\%$ if the color distribution is not sufficiently wide. Future low-$z$ data, particularly $u$-band observations and high signal-to-noise ratio SN Ia spectra, will help to significantly improve SN Ia modeling in the coming years.
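
For context, a brief sketch of the quantities involved, following standard SALT2/SALT3 conventions rather than anything stated explicitly in this abstract: the trained "model surfaces" are the spectral components $M_0(p,\lambda)$ and $M_1(p,\lambda)$, which together with the color law $CL(\lambda)$ define the rest-frame flux model $F(p,\lambda) = x_0\,[M_0(p,\lambda) + x_1 M_1(p,\lambda)]\,\exp[c\,CL(\lambda)]$, where $p$ is phase and $x_0$, $x_1$, $c$ are per-supernova fit parameters. Distances then follow from a Tripp-style standardization, $\mu = m_B + \alpha x_1 - \beta c - M_B$, so systematic shifts in the trained surfaces or color law propagate into the distance moduli and hence into the inferred value of $w$.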

Citations (1)
