Fast Parameter Inference on Pulsar Timing Arrays with Normalizing Flows (2310.12209v1)

Published 18 Oct 2023 in astro-ph.IM, astro-ph.HE, cs.LG, gr-qc, and hep-ph

Abstract: Pulsar timing arrays (PTAs) perform Bayesian posterior inference with expensive MCMC methods. Given a dataset of ~10-100 pulsars and O(10^3) timing residuals each, producing a posterior distribution for the stochastic gravitational wave background (SGWB) can take days to a week. The computational bottleneck arises because the likelihood evaluation required for MCMC is extremely costly when considering the dimensionality of the search space. Fortunately, generating simulated data is fast, so modern simulation-based inference techniques can be brought to bear on the problem. In this paper, we demonstrate how conditional normalizing flows trained on simulated data can be used for extremely fast and accurate estimation of the SGWB posteriors, reducing the sampling time from weeks to a matter of seconds.
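To illustrate the idea in the abstract — fit a conditional density estimator to (parameter, simulated data) pairs, then evaluate the amortized posterior for any observation at negligible cost — here is a minimal sketch. It is not the paper's method: the paper uses deep normalizing flows (masked autoregressive / neural spline flows in PyTorch) conditioned on PTA timing residuals, whereas this toy uses a hypothetical one-parameter Gaussian simulator and a single conditional affine flow layer trained by maximum likelihood in plain NumPy, so the whole pipeline fits in a few lines.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy simulator (stand-in for PTA timing-residual simulation):
# summary statistic x, parameter theta with true posterior
#   theta | x ~ Normal(2*x + 1, 0.5).
def simulate(n):
    x = rng.normal(size=n)
    theta = 2.0 * x + 1.0 + 0.5 * rng.normal(size=n)
    return x, theta

# One-layer conditional affine flow:
#   z = (theta - (a*x + b)) * exp(-s),   base density z ~ Normal(0, 1).
# Per-sample negative log-likelihood (constants dropped): 0.5*z**2 + s.
a, b, s = 0.0, 0.0, 0.0
lr = 0.05
for step in range(3000):
    x, theta = simulate(256)          # fresh simulations each step
    z = (theta - (a * x + b)) * np.exp(-s)
    # Analytic gradients of the mean NLL with respect to a, b, s.
    ga = np.mean(-z * x * np.exp(-s))
    gb = np.mean(-z * np.exp(-s))
    gs = np.mean(1.0 - z ** 2)
    a -= lr * ga
    b -= lr * gb
    s -= lr * gs

# Amortized inference: posterior samples for any observed x come from
# inverting the flow -- no likelihood evaluations, no MCMC.
def sample_posterior(x_obs, n):
    z = rng.normal(size=n)
    return a * x_obs + b + np.exp(s) * z

samples = sample_posterior(1.0, 10_000)  # mean should be close to 3.0
```

The training loop never touches the likelihood of the data, only the simulator — which is exactly the trade the paper exploits, since simulating PTA residuals is cheap while the MCMC likelihood is not. A real flow would replace the single affine map with a stack of learned invertible transforms conditioned on a summary embedding of the residuals.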
