Pretraining ECG Data with Adversarial Masking Improves Model Generalizability for Data-Scarce Tasks (2211.07889v1)

Published 15 Nov 2022 in cs.LG, cs.AI, and eess.SP

Abstract: Medical datasets often face the problem of data scarcity, as ground truth labels must be generated by medical professionals. One mitigation strategy is to pretrain deep learning models on large, unlabelled datasets with self-supervised learning (SSL). Data augmentations are essential for improving the generalizability of SSL-trained models, but they are typically handcrafted and tuned manually. We use an adversarial model to generate masks as augmentations for 12-lead electrocardiogram (ECG) data, where masks learn to occlude diagnostically relevant regions of the ECGs. Compared to random augmentations, adversarial masking reaches better accuracy when transferring to two diverse downstream objectives: arrhythmia classification and gender classification. Compared to a state-of-the-art ECG augmentation method, 3KG, adversarial masking performs better in data-scarce regimes, demonstrating the generalizability of our model.
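The adversarial masking idea described in the abstract can be viewed as a min-max game: an encoder is pretrained with a contrastive objective on masked and unmasked views of an ECG, while a mask generator is trained to occlude the regions that hurt that objective most. The following is a minimal PyTorch sketch under assumed details (the network architectures, NT-Xent loss, 12-lead by 2500-sample inputs, and sparsity penalty are illustrative assumptions, not the authors' implementation).

```python
# Minimal sketch of adversarial masking for ECG SSL pretraining.
# Hypothetical: architectures, the NT-Xent objective, input shape
# (12 leads x 2500 samples), and the sparsity penalty are assumptions,
# not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

LEADS, SAMPLES = 12, 2500  # assumed: 10 s of 12-lead ECG at 250 Hz


class Encoder(nn.Module):
    """Small 1-D CNN mapping an ECG to a unit-norm embedding."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(LEADS, 32, 7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, 7, stride=2, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(64, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)


class MaskGenerator(nn.Module):
    """Outputs a soft temporal mask; trained to hide informative regions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(LEADS, 16, 15, padding=7), nn.ReLU(),
            nn.Conv1d(16, 1, 15, padding=7),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x))  # (B, 1, T), broadcast over leads


def nt_xent(z1, z2, tau=0.1):
    """NT-Xent contrastive loss between two batches of embeddings."""
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)
    sim = (z @ z.t()) / tau
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool, device=z.device),
                          float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)


encoder, masker = Encoder(), MaskGenerator()
opt_enc = torch.optim.Adam(encoder.parameters(), lr=1e-3)
opt_msk = torch.optim.Adam(masker.parameters(), lr=1e-3)


def training_step(ecg):
    # Encoder step: minimize the contrastive loss between the original ECG
    # and its adversarially masked view.
    masked = ecg * (1.0 - masker(ecg))
    loss = nt_xent(encoder(ecg), encoder(masked))
    opt_enc.zero_grad()
    loss.backward()
    opt_enc.step()

    # Mask-generator step: maximize the same loss, with a small sparsity
    # penalty so the mask cannot simply occlude the whole signal.
    mask = masker(ecg)
    adv_loss = -nt_xent(encoder(ecg), encoder(ecg * (1.0 - mask))) + 0.1 * mask.mean()
    opt_msk.zero_grad()
    adv_loss.backward()
    opt_msk.step()
    return loss.item()


if __name__ == "__main__":
    # Smoke test on a random batch of 8 ECGs.
    print(training_step(torch.randn(8, LEADS, SAMPLES)))
```

The opposing objectives are what make the masks adversarial: as the encoder learns invariances, the generator is pushed toward occluding whatever regions still carry the most usable signal, which the paper argues removes the need for hand-tuned augmentations.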

Authors (4)
  1. Jessica Y. Bo (6 papers)
  2. Hen-Wei Huang (1 paper)
  3. Alvin Chan (15 papers)
  4. Giovanni Traverso (1 paper)
Citations (4)
