
Label-Efficient Sleep Staging Using Transformers Pre-trained with Position Prediction (2404.15308v1)

Published 29 Mar 2024 in eess.SP and cs.LG

Abstract: Sleep staging is a clinically important task for diagnosing various sleep disorders, but it remains challenging to deploy at scale because it is both labor-intensive and time-consuming. Supervised deep learning-based approaches can automate sleep staging, but at the cost of requiring large labeled datasets, which can be infeasible to procure in various settings, e.g., for uncommon sleep disorders. While self-supervised learning (SSL) can mitigate this need, recent studies on SSL for sleep staging have shown that performance gains saturate after training with labeled data from only tens of subjects, and hence are unable to match the peak performance attained with larger datasets. We hypothesize that this rapid saturation stems from applying a sub-optimal pretraining scheme that pretrains only a portion of the architecture, i.e., the feature encoder but not the temporal encoder; we therefore propose adopting an architecture that seamlessly couples the feature and temporal encoding, together with a suitable pretraining scheme that pretrains the entire model. On a sample sleep staging dataset, we find that the proposed scheme offers performance gains that do not saturate with the amount of labeled training data (e.g., a 3-5% improvement in balanced sleep staging accuracy across low- to high-labeled data settings), reducing the amount of labeled training data needed for high performance (e.g., by 800 subjects). Based on our findings, we recommend adopting this SSL paradigm for subsequent work on SSL for sleep staging.
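The position-prediction pretext task referenced in the abstract trains the model to recover where each shuffled input patch originally sat in the sequence. The following is a minimal, hypothetical sketch of the data-preparation side of such a task; the function name, patching scheme, and shuffling strategy are illustrative assumptions, not the paper's actual implementation.

```python
import random


def make_position_prediction_batch(signal, patch_len, seed=0):
    """Split a 1-D signal into fixed-length patches, shuffle them, and
    return (shuffled_patches, target_positions) for the pretext task.

    target_positions[j] is the original index of the j-th shuffled patch,
    which a model would be trained to predict from the patch contents.
    (Illustrative sketch; not the paper's implementation.)
    """
    usable_len = len(signal) - len(signal) % patch_len  # drop any remainder
    patches = [signal[i:i + patch_len] for i in range(0, usable_len, patch_len)]
    order = list(range(len(patches)))
    rng = random.Random(seed)  # seeded for reproducibility
    rng.shuffle(order)
    shuffled = [patches[i] for i in order]
    return shuffled, order
```

A model pretrained this way sees the shuffled patches (without positional embeddings) and must classify each patch's original position, forcing both the feature and temporal encoders to learn order-sensitive representations.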


