NucEL: Single-Nucleotide ELECTRA-Style Genomic Pre-training for Efficient and Interpretable Representations (2508.13191v1)

Published 15 Aug 2025 in q-bio.GN

Abstract: Pre-training language models on genomic sequences is a powerful approach for learning biologically meaningful representations. Masked language modeling (MLM) methods, such as DNABERT and Nucleotide Transformer (NT), achieve strong performance but suffer from partial token supervision, pre-training/fine-tuning mismatches, and high computational costs. We introduce NucEL, the first ELECTRA-style pre-training framework for genomic foundation models, addressing these limitations. Using a discriminator to identify tokens altered by a generator, NucEL provides comprehensive token-level supervision across all sequence positions, improving efficiency over the partial supervision of MLM. Incorporating ModernBERT's hybrid local-global attention and FlashAttention, NucEL offers an optimized BERT architecture for genomic modeling. Unlike 6-mer tokenization, NucEL uses single-nucleotide tokens for fine-grained resolution, boosting both efficiency and interpretability. Pre-trained on the human genome, NucEL achieves state-of-the-art results on diverse downstream tasks -- regulatory element identification (e.g., promoters, enhancers), transcription factor binding prediction, open chromatin classification, and histone modification profiling -- surpassing similarly sized MLM-based models and rivaling models 25x larger, such as NT. Ablation studies highlight optimal tokenization and masking strategies for ELECTRA-style DNA pre-training. Attention analysis reveals NucEL's superior capture of biologically relevant motifs compared to NT, providing insights into hierarchical learning and regulatory element modeling. These findings demonstrate ELECTRA-style pre-training as an efficient, effective strategy for genomic representation learning with broad implications for genomic research.
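The abstract's central mechanism is replaced-token detection: a small generator fills in masked nucleotides, and a discriminator then classifies, at every position, whether the nucleotide it sees is original or a generator replacement, which is what yields supervision over the full sequence rather than only the (typically ~15%) masked positions used by MLM. The sketch below illustrates this objective at single-nucleotide resolution. It is a minimal illustration assuming PyTorch, a toy five-token vocabulary, and small generic Transformer encoders; the class names (`TinyEncoder`, `Generator`, `Discriminator`) and the masking rate are illustrative assumptions, not the paper's code, and it omits the ModernBERT-style hybrid local-global attention and FlashAttention the authors describe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy single-nucleotide vocabulary (an assumption; the paper's exact token set,
# e.g. handling of N bases or special tokens, is not specified in the abstract).
VOCAB = {"A": 0, "C": 1, "G": 2, "T": 3, "[MASK]": 4}
VOCAB_SIZE = len(VOCAB)


def tokenize(seq: str) -> torch.Tensor:
    """Map a DNA string to single-nucleotide token ids."""
    return torch.tensor([VOCAB[base] for base in seq], dtype=torch.long)


class TinyEncoder(nn.Module):
    """Generic Transformer encoder used as a stand-in for the paper's
    ModernBERT-style hybrid local-global attention backbone."""

    def __init__(self, dim: int, layers: int):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, dim)
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        return self.encoder(self.embed(ids))


class Generator(nn.Module):
    """Small MLM head that proposes replacement nucleotides at masked positions."""

    def __init__(self):
        super().__init__()
        self.backbone = TinyEncoder(dim=32, layers=1)
        self.lm_head = nn.Linear(32, VOCAB_SIZE)

    def forward(self, ids):
        return self.lm_head(self.backbone(ids))


class Discriminator(nn.Module):
    """Per-position binary head: was this nucleotide replaced by the generator?"""

    def __init__(self):
        super().__init__()
        self.backbone = TinyEncoder(dim=64, layers=2)
        self.rtd_head = nn.Linear(64, 1)

    def forward(self, ids):
        return self.rtd_head(self.backbone(ids)).squeeze(-1)


def electra_step(seq: str, gen: Generator, disc: Discriminator, mask_prob: float = 0.15):
    """One ELECTRA-style pre-training step on a single sequence."""
    ids = tokenize(seq).unsqueeze(0)              # (1, L)
    mask = torch.rand(ids.shape) < mask_prob      # positions the generator must fill
    if not mask.any():
        mask[0, 0] = True                         # keep the demo losses well-defined

    # 1) Corrupt: mask the selected positions and let the generator sample replacements.
    masked_ids = ids.clone()
    masked_ids[mask] = VOCAB["[MASK]"]
    gen_logits = gen(masked_ids)                  # (1, L, V)
    sampled = torch.distributions.Categorical(logits=gen_logits).sample()
    corrupted = torch.where(mask, sampled, ids)

    # 2) Generator objective: standard MLM cross-entropy at masked positions only.
    gen_loss = F.cross_entropy(gen_logits[mask], ids[mask])

    # 3) Discriminator objective: replaced-token detection at EVERY position,
    #    the source of the full-sequence supervision described in the abstract.
    is_replaced = (corrupted != ids).float()
    disc_loss = F.binary_cross_entropy_with_logits(disc(corrupted), is_replaced)

    return gen_loss, disc_loss


if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    g_loss, d_loss = electra_step("ACGTTTGACCAGTACGGTACCGATAACGTTGCA", gen, disc)
    print(f"generator MLM loss {g_loss.item():.3f} | discriminator RTD loss {d_loss.item():.3f}")
```

In this setup, the generator is trained with the usual MLM cross-entropy on masked positions only, while the discriminator's binary loss covers every token; because replacements are sampled, no gradient flows from the discriminator back into the generator. ELECTRA-style models typically discard the generator after pre-training and fine-tune the discriminator on downstream tasks such as the regulatory-element and chromatin benchmarks the abstract lists.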
