
Data-Efficient Sleep Staging with Synthetic Time Series Pretraining (2403.08592v1)

Published 13 Mar 2024 in cs.LG and q-bio.QM

Abstract: Analyzing electroencephalographic (EEG) time series can be challenging, especially with deep neural networks, due to the large variability among human subjects and often small datasets. To address these challenges, various strategies, such as self-supervised learning, have been suggested, but they typically rely on extensive empirical datasets. Inspired by recent advances in computer vision, we propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging by predicting the frequency content of randomly generated synthetic time series. Our experiments demonstrate that our method surpasses fully supervised learning in scenarios with limited data and few subjects, and matches its performance in regimes with many subjects. Furthermore, our results underline the relevance of frequency information for sleep stage scoring, while also demonstrating that deep neural networks utilize information beyond frequencies to enhance sleep staging performance, which is consistent with previous research. We anticipate that our approach will be advantageous across a broad spectrum of applications where EEG data is limited or derived from a small number of subjects, including the domain of brain-computer interfaces.
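The core idea in the abstract is to pretrain a network to predict the frequency content of randomly generated synthetic time series before fine-tuning it on real EEG for sleep staging. The snippet below is a minimal, hypothetical sketch of such a frequency-pretraining setup in PyTorch; the frequency bins, epoch length, network architecture, and training details are illustrative assumptions and do not reproduce the authors' implementation (their code is at https://github.com/dslaborg/frequency-pretraining).

```python
# Hypothetical sketch of "frequency pretraining": a network is pretrained to
# predict which frequencies are present in randomly generated synthetic
# time series. Bin edges, network size, and sampling details are illustrative
# assumptions, not the paper's exact configuration.
import numpy as np
import torch
import torch.nn as nn

FS = 100               # assumed sampling rate in Hz
N_SAMPLES = 30 * FS    # one 30-second epoch, as in sleep staging
FREQ_BINS = np.linspace(0.5, 30.0, 21)  # hypothetical frequency bin edges (Hz)


def make_synthetic_batch(batch_size: int):
    """Random sums of sinusoids; target = multi-hot vector of active frequency bins."""
    x = np.zeros((batch_size, N_SAMPLES), dtype=np.float32)
    y = np.zeros((batch_size, len(FREQ_BINS) - 1), dtype=np.float32)
    t = np.arange(N_SAMPLES) / FS
    for i in range(batch_size):
        n_components = np.random.randint(1, 5)
        for _ in range(n_components):
            bin_idx = np.random.randint(len(FREQ_BINS) - 1)
            freq = np.random.uniform(FREQ_BINS[bin_idx], FREQ_BINS[bin_idx + 1])
            amp = np.random.uniform(0.5, 1.5)
            phase = np.random.uniform(0, 2 * np.pi)
            x[i] += amp * np.sin(2 * np.pi * freq * t + phase)
            y[i, bin_idx] = 1.0
        x[i] += 0.1 * np.random.randn(N_SAMPLES)  # additive noise
    return torch.from_numpy(x).unsqueeze(1), torch.from_numpy(y)


# Small 1-D CNN feature extractor plus a multi-label head (illustrative only).
feature_extractor = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=51, stride=4), nn.ReLU(),
    nn.Conv1d(16, 32, kernel_size=25, stride=4), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
head = nn.Linear(32, len(FREQ_BINS) - 1)
model = nn.Sequential(feature_extractor, head)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()  # multi-label: several bins can be active at once

for step in range(1000):  # pretraining loop on purely synthetic data
    x, y = make_synthetic_batch(64)
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

# After pretraining, `feature_extractor` would be reused (and fine-tuned)
# on real EEG epochs for five-class sleep staging.
```

Because the synthetic data are generated on the fly with known frequency content, no labeled EEG is needed during pretraining; only the downstream sleep-staging fine-tuning requires annotated recordings, which is what makes the approach attractive in the limited-data regimes the abstract describes.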

