Fast Parameter Inference on Pulsar Timing Arrays with Normalizing Flows (2310.12209v1)
Abstract: Pulsar timing arrays (PTAs) perform Bayesian posterior inference with expensive MCMC methods. Given a dataset of ~10-100 pulsars and O(10^3) timing residuals each, producing a posterior distribution for the stochastic gravitational wave background (SGWB) can take days to a week. The computational bottleneck arises because the likelihood evaluation required for MCMC is extremely costly when considering the dimensionality of the search space. Fortunately, generating simulated data is fast, so modern simulation-based inference techniques can be brought to bear on the problem. In this paper, we demonstrate how conditional normalizing flows trained on simulated data can be used for extremely fast and accurate estimation of the SGWB posteriors, reducing the sampling time from weeks to a matter of seconds.
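The core idea in the abstract — train a conditional density estimator on (parameter, simulated-data) pairs so that costly likelihood evaluations are never needed at inference time — can be illustrated with a minimal sketch. The toy simulator, the single affine conditional flow q(θ|x) = N(a·x + b, s²), and all names below are illustrative assumptions, not the paper's actual PTA simulator or flow architecture (the paper uses far richer flows on timing residuals):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the PTA setup: the likelihood may be expensive to
# evaluate, but drawing (theta, x) pairs from the simulator is cheap.
SIGMA = 0.5
def simulate(n):
    theta = rng.uniform(-3.0, 3.0, size=n)   # prior draws
    x = theta + SIGMA * rng.normal(size=n)   # simulated observations
    return theta, x

# Simplest possible conditional normalizing flow: one affine transform,
# q(theta | x) = N(a*x + b, s^2). Maximizing E[log q(theta | x)] over
# simulated pairs is the neural-posterior-estimation objective in miniature.
a, b, log_s = 0.0, 0.0, 0.0
lr = 0.05
theta, x = simulate(20_000)
for _ in range(2000):
    idx = rng.integers(0, len(x), size=256)
    t, xi = theta[idx], x[idx]
    s = np.exp(log_s)
    z = (t - (a * xi + b)) / s        # residual in the base (latent) space
    # Gradients of the mean negative log-likelihood of N(a*x + b, s^2):
    # per-sample NLL = log s + z^2/2 + const.
    ga = np.mean(-z / s * xi)
    gb = np.mean(-z / s)
    gs = np.mean(1.0 - z**2)          # derivative w.r.t. log s
    a, b, log_s = a - lr * ga, b - lr * gb, log_s - lr * gs

# Once trained, the amortized posterior for any observed x is available
# instantly: mean a*x + b, standard deviation exp(log_s).
print(f"a={a:.3f}  b={b:.3f}  s={np.exp(log_s):.3f}")
```

After training, evaluating the posterior for a new observation costs one affine map instead of an MCMC run, which is the amortization the paper exploits; real applications replace this single affine layer with stacked autoregressive/spline transforms conditioned on summary statistics of the timing residuals.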