
Structure-Preserving Transformers for Sequences of SPD Matrices (2309.07579v7)

Published 14 Sep 2023 in cs.LG and eess.SP

Abstract: In recent years, Transformer-based self-attention mechanisms have been successfully applied to the analysis of a variety of context-dependent data types, from texts to images and beyond, including data from non-Euclidean geometries. In this paper, we present such a mechanism, designed to classify sequences of Symmetric Positive Definite (SPD) matrices while preserving their Riemannian geometry throughout the analysis. We apply our method to automatic sleep staging on time series of EEG-derived covariance matrices from a standard dataset, obtaining high levels of stage-wise performance.
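
The abstract does not detail the architecture, so the following is only a minimal sketch of the core idea: self-attention over SPD matrices that stays on the manifold, written here in the log-Euclidean framework of Arsigny et al. (2006). Attention scores are derived from log-Euclidean distances, and each output is a weighted mean computed in the tangent space. The function names, the distance-based scoring, the temperature choice, and the toy data are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def sym_logm(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def sym_expm(L):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(L)
    return (V * np.exp(w)) @ V.T

def log_euclidean_attention(seq):
    """
    Sketch of self-attention over a sequence of SPD matrices that stays on
    the manifold: scores come from pairwise log-Euclidean distances, and each
    output is a weighted mean in the tangent space (a convex combination of
    matrix logarithms), so every output matrix is again SPD.

    seq: array of shape (T, n, n), each seq[t] symmetric positive definite.
    """
    T = seq.shape[0]
    logs = np.stack([sym_logm(S) for S in seq])       # map to tangent space, (T, n, n)
    flat = logs.reshape(T, -1)                        # vectorised tangent vectors
    # Pairwise squared log-Euclidean distances -> softmax-like attention weights.
    d2 = ((flat[:, None, :] - flat[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / flat.shape[1] ** 0.5)            # temperature-scaled scores (assumption)
    w /= w.sum(axis=1, keepdims=True)                 # rows are convex weights
    out_logs = np.einsum('ts,sij->tij', w, logs)      # weighted mean per position
    return np.stack([sym_expm(L) for L in out_logs])  # map back to the SPD manifold

# Toy usage: random SPD matrices standing in for per-epoch EEG covariance estimates.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8, 256))                  # 5 epochs, 8 channels, 256 samples
seq = np.einsum('tcs,tds->tcd', X, X) / 256 + 1e-6 * np.eye(8)  # sample covariances
out = log_euclidean_attention(seq)
print(out.shape)                                      # (5, 8, 8)
```

Since each row of the weight matrix is non-negative and sums to one, every output logarithm is a convex combination of symmetric matrices, and its matrix exponential is therefore guaranteed SPD. This is the structure preservation the title referss to: the attention output never leaves the manifold the inputs live on.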
