
Medical Concept Embedding with Time-Aware Attention (1806.02873v1)

Published 6 Jun 2018 in cs.CL and cs.AI

Abstract: Embeddings of medical concepts such as medication, procedure and diagnosis codes in Electronic Medical Records (EMRs) are central to healthcare analytics. Previous work on medical concept embedding treats medical concepts and EMRs as words and documents, respectively. Nevertheless, such models overlook the temporal nature of EMR data. On the one hand, two medical concepts being consecutive does not mean they are temporally close; the correlation between them is instead revealed by their time gap. On the other hand, the temporal scopes of medical concepts often vary greatly (e.g., *common cold* versus *diabetes*). In this paper, we propose to incorporate temporal information when embedding medical codes. Building on the Continuous Bag-of-Words model, we employ an attention mechanism to learn a "soft" time-aware context window for each medical concept. Experiments on public and proprietary datasets, using clustering and nearest-neighbour search tasks, demonstrate the effectiveness of our model, which outperforms five state-of-the-art baselines.
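The abstract's core idea admits a compact illustration: a CBOW-style objective in which the context codes around a target code are aggregated with attention weights computed from both the context-code embeddings and an embedding of the (discretized) time gap between each context code and the target. The sketch below is a minimal PyTorch rendering of that idea; the class name `TimeAwareCBOW`, the concatenation-based attention scorer, the full-softmax loss, and all hyperparameters are assumptions for illustration, not the paper's exact architecture or training objective.

```python
# Minimal sketch of a CBOW-style medical code embedding with a time-aware
# attention window. The attention form and names are assumptions, not the
# paper's specification.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeAwareCBOW(nn.Module):
    def __init__(self, num_codes, embed_dim, num_time_bins):
        super().__init__()
        self.code_embed = nn.Embedding(num_codes, embed_dim)      # input (context) code embeddings
        self.out_embed = nn.Embedding(num_codes, embed_dim)       # output (target) code embeddings
        self.time_embed = nn.Embedding(num_time_bins, embed_dim)  # embeddings of discretized time gaps
        self.attn = nn.Linear(2 * embed_dim, 1)                   # scores each (context code, time gap) pair

    def forward(self, context_codes, time_bins, target_codes):
        # context_codes, time_bins: (batch, window); target_codes: (batch,)
        ctx = self.code_embed(context_codes)                       # (batch, window, dim)
        gap = self.time_embed(time_bins)                           # (batch, window, dim)
        scores = self.attn(torch.cat([ctx, gap], dim=-1)).squeeze(-1)  # (batch, window)
        weights = F.softmax(scores, dim=-1)                        # "soft" time-aware context window
        context_vec = (weights.unsqueeze(-1) * ctx).sum(dim=1)    # (batch, dim)
        logits = context_vec @ self.out_embed.weight.t()           # (batch, num_codes)
        return F.cross_entropy(logits, target_codes)

# Toy usage: predict a target code from its temporally annotated context.
model = TimeAwareCBOW(num_codes=1000, embed_dim=64, num_time_bins=50)
ctx = torch.randint(0, 1000, (8, 6))    # 8 windows, 6 context codes each
gaps = torch.randint(0, 50, (8, 6))     # discretized time gaps to the target code
tgt = torch.randint(0, 1000, (8,))
loss = model(ctx, gaps, tgt)
loss.backward()
```

In a sketch like this, the attention weights let temporally distant codes contribute less to the context vector, rather than relying on a fixed hard window over the code sequence.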

Authors (6)
  1. Xiangrui Cai (10 papers)
  2. Jinyang Gao (35 papers)
  3. Kee Yuan Ngiam (6 papers)
  4. Beng Chin Ooi (79 papers)
  5. Ying Zhang (389 papers)
  6. Xiaojie Yuan (26 papers)
Citations (64)
