
Temporal Stamp Classifier: Classifying Short Sequences of Astronomical Alerts (2405.15073v1)

Published 23 May 2024 in astro-ph.IM and cs.AI

Abstract: In this work, we propose a deep learning-based classification model for astronomical objects using alerts reported by the Zwicky Transient Facility (ZTF) survey. The model takes as inputs sequences of stamp images and metadata contained in each alert, as well as features from the All-WISE catalog. The proposed model, called the temporal stamp classifier, discriminates between three classes of astronomical objects: Active Galactic Nuclei (AGN), Supernovae (SNe), and Variable Stars (VS), with an accuracy of approximately 98% on the test set when using 2 to 5 detections. The results show that model performance improves as more detections are added. Simple recurrence models obtain results competitive with those of more complex models such as LSTMs. We also propose changes to the original stamp classifier model, which uses only the first detection. The performance of the latter model improves with changes to the architecture and the addition of random rotations, achieving a 1.46% increase in test accuracy.
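The "simple recurrence" variant mentioned in the abstract can be illustrated with a minimal sketch: each alert is reduced to a feature vector (stamps, metadata, All-WISE features), a plain Elman-style recurrence accumulates a hidden state over the 2 to 5 detections, and a softmax head outputs probabilities for the three classes. All names, dimensions, and the feature pipeline below are assumptions for illustration, not the authors' implementation, and the parameters are left untrained.

```python
import numpy as np

# Illustrative sketch only: a simple recurrence over per-alert feature
# vectors, standing in for the paper's "simple recurrence" variant.
# Dimensions and feature extraction are assumed, not taken from the paper.

rng = np.random.default_rng(0)

N_CLASSES = 3   # AGN, SNe, VS
FEAT_DIM = 32   # assumed per-detection feature size (stamps + metadata + All-WISE)
HIDDEN = 16     # assumed hidden-state size

# Randomly initialized parameters (training not shown).
W_x = rng.normal(0, 0.1, (HIDDEN, FEAT_DIM))
W_h = rng.normal(0, 0.1, (HIDDEN, HIDDEN))
b_h = np.zeros(HIDDEN)
W_o = rng.normal(0, 0.1, (N_CLASSES, HIDDEN))
b_o = np.zeros(N_CLASSES)

def classify_sequence(detections: np.ndarray) -> np.ndarray:
    """Run the recurrence over a (T, FEAT_DIM) sequence of detections
    and return a probability vector over the three classes."""
    h = np.zeros(HIDDEN)
    for x in detections:                      # one feature vector per alert
        h = np.tanh(W_x @ x + W_h @ h + b_h)  # simple recurrence step
    logits = W_o @ h + b_o
    e = np.exp(logits - logits.max())         # numerically stable softmax
    return e / e.sum()

# Example: a sequence of 4 detections with random features.
probs = classify_sequence(rng.normal(size=(4, FEAT_DIM)))
print(probs.shape)
```

Because the hidden state is updated once per detection, the same model naturally handles 2, 3, 4, or 5 alerts, which matches the paper's observation that accuracy improves as detections accumulate.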
