
Non-Imaging Medical Data Synthesis for Trustworthy AI: A Comprehensive Survey (2209.09239v1)

Published 17 Sep 2022 in cs.LG, cs.AI, and cs.CV

Abstract: Data quality is the key factor for the development of trustworthy AI in healthcare. A large volume of curated datasets with controlled confounding factors can help improve the accuracy, robustness and privacy of downstream AI algorithms. However, access to good-quality datasets is limited by the technical difficulty of data acquisition, and large-scale sharing of healthcare data is hindered by strict ethical restrictions. Data synthesis algorithms, which generate data with a distribution similar to that of real clinical data, can serve as a potential solution to the scarcity of good-quality data during the development of trustworthy AI. However, state-of-the-art data synthesis algorithms, especially deep learning algorithms, focus more on imaging data and neglect the synthesis of non-imaging healthcare data, including clinical measurements, medical signals and waveforms, and electronic healthcare records (EHRs). Thus, in this paper, we review synthesis algorithms for non-imaging medical data, with the aim of supporting trustworthy AI in this domain. This tutorial-styled review provides comprehensive descriptions of non-imaging medical data synthesis, covering algorithms, evaluation, limitations, and future research directions.
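The abstract's core idea, generating synthetic records whose distribution matches that of real clinical data, can be sketched with a toy example: fit a simple parametric model (here a multivariate Gaussian) to a tabular cohort and sample new rows from it. This is a deliberate simplification, not one of the deep generative methods the survey covers, and all feature names and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real" cohort: 500 patients x 3 clinical measurements
# (e.g. systolic BP, heart rate, glucose) -- illustrative values only.
real = rng.multivariate_normal(
    mean=[120.0, 72.0, 95.0],
    cov=[[150.0, 20.0, 10.0],
         [20.0, 60.0, 5.0],
         [10.0, 5.0, 80.0]],
    size=500,
)

# Fit a multivariate Gaussian to the real data ...
mu = real.mean(axis=0)
sigma = np.cov(real, rowvar=False)

# ... and sample a synthetic cohort of the same size from it.
synthetic = rng.multivariate_normal(mu, sigma, size=500)

# A basic fidelity check: synthetic feature means should be close
# to the real ones (deep synthesis methods use far richer metrics).
print(np.abs(synthetic.mean(axis=0) - mu))
```

Real non-imaging medical data is rarely this well-behaved (mixed categorical/continuous fields, missingness, temporal structure), which is why the survey focuses on learned generative models rather than closed-form fits like this one.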

Authors (8)
  1. Xiaodan Xing (35 papers)
  2. Huanjun Wu (4 papers)
  3. Lichao Wang (11 papers)
  4. Iain Stenson (1 paper)
  5. May Yong (2 papers)
  6. Javier Del Ser (100 papers)
  7. Simon Walsh (16 papers)
  8. Guang Yang (422 papers)
Citations (4)
