
Transformer-based Single-Cell Language Model: A Survey (2407.13205v1)

Published 18 Jul 2024 in cs.CL

Abstract: Transformers have achieved significant accomplishments in natural language processing thanks to their outstanding parallel processing capabilities and highly flexible attention mechanism. In addition, a growing number of transformer-based studies have been proposed to model single-cell data. In this review, we attempt to systematically summarize single-cell LLMs and their transformer-based applications. First, we provide a detailed introduction to the structure and principles of transformers. Then, we review single-cell LLMs and LLMs for single-cell data analysis. Moreover, we explore the datasets and applications of single-cell LLMs in downstream tasks such as batch correction, cell clustering, cell type annotation, gene regulatory network inference, and perturbation response. Further, we discuss the challenges of single-cell LLMs and suggest promising research directions. We hope this review will serve as an up-to-date reference for researchers interested in single-cell LLMs.

Citations (2)
