
LaTiM: Longitudinal representation learning in continuous-time models to predict disease progression (2404.07091v1)

Published 10 Apr 2024 in cs.LG and cs.AI

Abstract: This work proposes a novel framework for analyzing disease progression using time-aware neural ordinary differential equations (NODE). We introduce a "time-aware head" in a framework trained through self-supervised learning (SSL) to leverage temporal information in latent space for data augmentation. This approach effectively integrates NODEs with SSL, offering significant performance improvements compared to traditional methods that lack explicit temporal integration. We demonstrate the effectiveness of our strategy for diabetic retinopathy progression prediction using the OPHDIAT database. Compared to the baseline, all NODE architectures achieve statistically significant improvements in area under the ROC curve (AUC) and Kappa metrics, highlighting the efficacy of pre-training with SSL-inspired approaches. Additionally, our framework promotes stable training for NODEs, a commonly encountered challenge in time-aware modeling.
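The abstract's core idea, evolving a latent representation between irregularly spaced exams with a neural ODE, can be illustrated with a minimal sketch. This is not the authors' code: the dynamics function, weights, dimensions, and fixed-step Euler solver are all illustrative stand-ins (a real NODE would use a learned network and an adaptive solver), and the "time-aware" aspect is approximated by feeding elapsed time into the dynamics.

```python
# Minimal sketch (illustrative, not the LaTiM implementation): a latent
# state z evolved between two exam times via dz/dt = f(z, t), with time t
# as an extra input to toy single-layer tanh dynamics.
import math
import random

def dynamics(z, t, w):
    # Toy f(z, t): one tanh layer; weights `w` would normally be learned.
    # Time t is appended to the state so the dynamics are time-aware.
    return [math.tanh(sum(wi * xi for wi, xi in zip(row, z + [t])))
            for row in w]

def integrate(z0, t0, t1, w, steps=20):
    # Fixed-step Euler integration over the irregular interval [t0, t1],
    # standing in for the ODE solver used in a NODE.
    z, t = list(z0), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        dz = dynamics(z, t, w)
        z = [zi + h * di for zi, di in zip(z, dz)]
        t += h
    return z

random.seed(0)
dim = 4  # illustrative latent dimension
w = [[random.uniform(-0.5, 0.5) for _ in range(dim + 1)] for _ in range(dim)]
z0 = [0.1] * dim                       # latent state at the first exam
z1 = integrate(z0, t0=0.0, t1=2.5, w=w)  # latent state 2.5 "years" later
```

In the paper's setting, `z1` would feed a prediction head for disease progression, and SSL pre-training would shape the latent space before supervised fine-tuning.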

Authors (12)
  1. Rachid Zeghlache
  2. Pierre-Henri Conze
  3. Mostafa El Habib Daho
  4. Yihao Li
  5. Hugo Le Boité
  6. Ramin Tadayoni
  7. Pascal Massin
  8. Béatrice Cochener
  9. Alireza Rezaei
  10. Ikram Brahim
  11. Gwenolé Quellec
  12. Mathieu Lamard

