Transformer-based Single-Cell Language Model: A Survey (2407.13205v1)
Abstract: Transformers have achieved remarkable success in natural language processing owing to their outstanding parallel processing capabilities and highly flexible attention mechanism. In addition, a growing number of transformer-based studies have been proposed to model single-cell data. In this review, we systematically summarize transformer-based single-cell language models (single-cell LLMs) and their applications. First, we provide a detailed introduction to the structure and principles of transformers. Then, we review single-cell LLMs and LLMs for single-cell data analysis. Moreover, we examine the datasets and applications of single-cell LLMs in downstream tasks such as batch correction, cell clustering, cell type annotation, gene regulatory network inference, and perturbation response prediction. Finally, we discuss the challenges facing single-cell LLMs and suggest promising research directions. We hope this review will serve as an up-to-date reference for researchers interested in single-cell LLMs.