
TunBERT: Pretrained Contextualized Text Representation for Tunisian Dialect (2111.13138v1)

Published 25 Nov 2021 in cs.CL and cs.LG

Abstract: Pretrained contextualized text representation models learn an effective representation of a natural language to make it machine understandable. After the breakthrough of the attention mechanism, a new generation of pretrained models has been proposed, achieving good performance since the introduction of the Transformer. Bidirectional Encoder Representations from Transformers (BERT) has become the state-of-the-art model for language understanding. Despite their success, most of the available models have been trained on Indo-European languages; however, similar research for under-represented languages and dialects remains sparse. In this paper, we investigate the feasibility of training monolingual Transformer-based language models for under-represented languages, with a specific focus on the Tunisian dialect. We evaluate our language model on a sentiment analysis task, a dialect identification task, and a reading comprehension question-answering task. We show that the use of noisy web-crawled data instead of structured data (Wikipedia, articles, etc.) is more suitable for such a non-standardized language. Moreover, results indicate that a relatively small web-crawled dataset leads to performance as good as that obtained using larger datasets. Finally, our best-performing TunBERT model reaches or improves the state of the art in all three downstream tasks. We release the TunBERT pretrained model and the datasets used for fine-tuning.
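To make the downstream evaluation concrete, the sketch below shows one way a BERT-style checkpoint such as TunBERT could be fine-tuned for the sentiment analysis task using the Hugging Face transformers library. The model path "tunbert-checkpoint", the example sentences, and the labels are illustrative assumptions, not the paper's released artifacts or official pipeline.

```python
# Minimal sketch: fine-tuning a BERT-style checkpoint for binary sentiment
# analysis, mirroring the paper's downstream evaluation setup.
# NOTE: "tunbert-checkpoint" is a hypothetical local path / hub name.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tunbert-checkpoint")
model = AutoModelForSequenceClassification.from_pretrained(
    "tunbert-checkpoint", num_labels=2  # positive / negative
)

# Toy batch; real fine-tuning would iterate over a labeled dataset.
texts = ["example positive sentence", "example negative sentence"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)  # returns cross-entropy loss + logits
outputs.loss.backward()
optimizer.step()
```

The same pattern applies to the dialect identification task (a different label set) and, with AutoModelForQuestionAnswering, to the reading comprehension task.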

Authors (9)
  1. Abir Messaoudi
  2. Ahmed Cheikhrouhou
  3. Hatem Haddad
  4. Nourchene Ferchichi
  5. Moez BenHajhmida
  6. Abir Korched
  7. Malek Naski
  8. Faten Ghriss
  9. Amine Kerkeni
Citations (7)