HemaGraph: Breaking Barriers in Hematologic Single Cell Classification with Graph Attention (2402.18611v1)

Published 28 Feb 2024 in q-bio.QM, cs.LG, and q-bio.CB

Abstract: In the realm of hematologic cell population classification, the intricate patterns within flow cytometry data necessitate advanced analytical tools. This paper presents 'HemaGraph', a novel framework based on Graph Attention Networks (GATs) for single-cell multi-class classification of hematological cells from flow cytometry data. Harnessing the power of GATs, our method captures subtle cell relationships, offering highly accurate patient profiling. Based on an evaluation of data from 30 patients, HemaGraph demonstrates strong classification performance across five different cell classes, outperforming traditional methodologies and state-of-the-art methods. Moreover, the uniqueness of this framework lies in the training and testing phases of HemaGraph, where it has been applied to extremely large graphs containing up to hundreds of thousands of nodes and two million edges, to detect low-frequency cell populations (e.g. 0.01% for one population), with accuracies reaching 98%. Our findings underscore the potential of HemaGraph in improving hematologic multi-class classification, paving the way for patient-personalized interventions. To the best of our knowledge, this is the first effort to use GATs, and Graph Neural Networks (GNNs) in general, to classify cell populations from single-cell flow cytometry data. We envision applying this method to single-cell data from larger cohorts of patients and to other hematologic diseases.
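The abstract describes GAT-based multi-class node classification over large cell graphs built from flow cytometry measurements. The snippet below is a minimal sketch of that general setup using PyTorch Geometric; it is not the authors' HemaGraph implementation. The k-nearest-neighbor graph construction, layer sizes, marker count, and training loop are illustrative assumptions, since the abstract does not specify them.

```python
# Minimal GAT node-classification sketch for flow-cytometry-style data,
# assuming PyTorch Geometric (with torch-cluster for knn_graph).
# Hypothetical choices: 10 markers per cell, 5 cell classes, k-NN graph.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GATConv, knn_graph


class CellGAT(torch.nn.Module):
    def __init__(self, num_markers: int, num_classes: int = 5,
                 hidden: int = 64, heads: int = 4):
        super().__init__()
        # First layer: multi-head attention over neighboring cells.
        self.gat1 = GATConv(num_markers, hidden, heads=heads)
        # Second layer: map concatenated heads to class logits.
        self.gat2 = GATConv(hidden * heads, num_classes, heads=1)

    def forward(self, x, edge_index):
        x = F.elu(self.gat1(x, edge_index))
        return self.gat2(x, edge_index)


# Toy data: 1,000 "cells", each with 10 marker intensities.
x = torch.randn(1000, 10)
# Assumed graph construction: connect each cell to its 10 nearest
# neighbors in marker space (the paper does not state this choice).
edge_index = knn_graph(x, k=10)
y = torch.randint(0, 5, (1000,))  # placeholder class labels
data = Data(x=x, edge_index=edge_index, y=y)

model = CellGAT(num_markers=10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

model.train()
for epoch in range(5):  # short demo loop only
    optimizer.zero_grad()
    logits = model(data.x, data.edge_index)
    loss = F.cross_entropy(logits, data.y)
    loss.backward()
    optimizer.step()
```

For the class-imbalance regime mentioned in the abstract (populations around 0.01% of cells), a weighted cross-entropy loss or stratified sampling would typically be added; that detail is not covered by the abstract and is left out of the sketch.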

