
TAPER: Time-Aware Patient EHR Representation (1908.03971v4)

Published 11 Aug 2019 in cs.LG, cs.CL, and stat.ML

Abstract: Effective representation learning of electronic health records is a challenging task and is becoming more important as the availability of such data becomes pervasive. The data contained in these records are irregular and span multiple modalities, such as notes and medical codes. They are prompted by medical conditions the patient may have and are typically jotted down by medical staff. Accompanying the codes are notes containing valuable information about patients beyond the structured information contained in electronic health records. We use transformer networks and the recently proposed BERT language model to embed these data streams into a unified vector representation. The presented approach effectively encodes a patient's visit data into a single distributed representation, which can be used for downstream tasks. Our model demonstrates superior performance and generalization on mortality, readmission, and length-of-stay tasks using the publicly available MIMIC-III ICU dataset. Code available at https://github.com/sajaddarabi/TAPER-EHR
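The abstract describes fusing two modality streams, medical-code embeddings from a transformer and clinical-note embeddings from BERT, into a single per-visit vector. The sketch below illustrates one simple fusion strategy (concatenation) with stand-in vectors; the dimensions, encoder outputs, and fusion choice are assumptions for illustration and may differ from TAPER's actual architecture.

```python
import numpy as np

# Hypothetical embedding sizes; the paper's actual dimensions may differ.
CODE_DIM = 128   # stand-in for the transformer embedding of a visit's medical codes
TEXT_DIM = 128   # stand-in for the BERT embedding of the visit's clinical notes

rng = np.random.default_rng(0)

def encode_visit(code_embedding, text_embedding):
    """Fuse the two modality embeddings into one visit representation.

    Concatenation is one simple fusion strategy; it preserves both
    modalities and yields a fixed-size vector for downstream tasks
    such as mortality or readmission prediction.
    """
    return np.concatenate([code_embedding, text_embedding])

# Stand-ins for the outputs of the code encoder and the text encoder.
code_vec = rng.standard_normal(CODE_DIM)
text_vec = rng.standard_normal(TEXT_DIM)

visit_repr = encode_visit(code_vec, text_vec)
print(visit_repr.shape)  # a single distributed representation per visit
```

A downstream classifier (e.g. for readmission) would then take `visit_repr` as its input feature vector.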

Authors (4)
  1. Sajad Darabi (15 papers)
  2. Mohammad Kachuee (25 papers)
  3. Shayan Fazeli (13 papers)
  4. Majid Sarrafzadeh (30 papers)
Citations (50)

