CNNs, LSTMs, and Attention Networks for Pathology Detection in Medical Data (1912.00852v1)

Published 2 Dec 2019 in cs.LG and stat.ML

Abstract: For the weakly supervised task of electrocardiogram (ECG) rhythm classification, convolutional neural networks (CNNs) and long short-term memory (LSTM) networks are two increasingly popular classification models. This work investigates whether combining both architectures into so-called convolutional long short-term memory (ConvLSTM) networks can improve classification performance by explicitly capturing morphological as well as temporal features of raw ECG records. In addition, various attention mechanisms are studied to localize and visualize record sections of abnormal morphology and irregular rhythm. The resulting saliency maps are intended not only to aid understanding of the network but also to improve clinicians' acceptance of automatic diagnosis, avoiding the technique being dismissed as a black box. In further experiments, attention mechanisms are actively incorporated into the training process by learning a few additional attention gating parameters in a CNN model. An 8-fold cross-validation is finally carried out on the PhysioNet Computing in Cardiology (CinC) challenge 2017 dataset to compare the performance of standard CNN models, ConvLSTMs, and attention-gated CNNs.
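The abstract does not spell out the attention gating mechanism, but the core idea it describes (a few learned parameters that weight per-timestep CNN features, with the weights doubling as a saliency map) can be sketched as follows. This is a minimal NumPy illustration with hypothetical names and random data, not the authors' implementation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the time axis.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_gated_pooling(features, w, b):
    """Collapse per-timestep CNN features into one record-level vector.

    features: (T, C) array of feature vectors from a 1-D conv stack
              over an ECG record (hypothetical shapes).
    w, b:     parameters of a tiny scoring head -- the "few additional
              attention gating parameters" the abstract mentions.
    Returns the pooled (C,) vector used for classification and the
    (T,) attention weights, which serve as a saliency map.
    """
    scores = features @ w + b      # (T,) relevance score per timestep
    alpha = softmax(scores)        # attention weights, non-negative, sum to 1
    pooled = alpha @ features      # (C,) attention-weighted feature average
    return pooled, alpha

# Toy example: 30 timesteps, 8 feature channels of random "CNN features".
rng = np.random.default_rng(0)
features = rng.standard_normal((30, 8))
w = rng.standard_normal(8)

pooled, alpha = attention_gated_pooling(features, w, b=0.0)
print(pooled.shape, alpha.shape)   # pooled vector and saliency weights
```

In a trained model, timesteps covering abnormal morphology or irregular rhythm would receive larger weights in `alpha`, which is what makes the map useful for visualization.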

Citations (6)
