Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages (2010.03179v1)

Published 7 Oct 2020 in cs.CL and cs.LG

Abstract: Multilingual transformer models like mBERT and XLM-RoBERTa have obtained great improvements for many NLP tasks on a variety of languages. However, recent works also showed that results from high-resource languages could not be easily transferred to realistic, low-resource scenarios. In this work, we study trends in performance for different amounts of available resources for the three African languages Hausa, isiXhosa and Yorùbá on both NER and topic classification. We show that in combination with transfer learning or distant supervision, these models can achieve with as little as 10 or 100 labeled sentences the same performance as baselines with much more supervised training data. However, we also find settings where this does not hold. Our discussions and additional experiments on assumptions such as time and hardware restrictions highlight challenges and opportunities in low-resource learning.
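
The abstract describes fine-tuning multilingual transformers such as mBERT and XLM-RoBERTa on very small labeled sets (10-100 sentences). As an illustration only, the sketch below shows what such a low-resource fine-tuning loop could look like with the Hugging Face transformers library; the label set, placeholder sentence, and hyperparameters are assumptions for the sketch, not the authors' actual setup.

```python
# A minimal sketch, not the authors' implementation: fine-tuning XLM-RoBERTa for
# token classification (NER) on a tiny labeled set, mirroring the paper's
# 10-100 sentence low-resource setting.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]  # hypothetical tag inventory
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=len(labels))

# One placeholder sentence stands in for a labeled Hausa/isiXhosa/Yoruba example.
enc = tokenizer("Placeholder sentence mentioning Abuja", return_tensors="pt")
label_ids = torch.full(enc["input_ids"].shape, -100)  # -100 = ignored in the loss
label_ids[0, 1] = labels.index("B-LOC")               # crude label on the first subword

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for _ in range(3):  # a few gradient steps, purely illustrative
    loss = model(**enc, labels=label_ids).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In the paper's setting, this kind of loop would be combined with transfer learning (initializing from a model fine-tuned on a related high-resource dataset) or with distant supervision (automatically labeled sentences standing in for the manual annotations).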

Authors (6)
  1. Michael A. Hedderich (28 papers)
  2. David Adelani (7 papers)
  3. Dawei Zhu (47 papers)
  4. Jesujoba Alabi (11 papers)
  5. Udia Markus (1 paper)
  6. Dietrich Klakow (114 papers)
Citations (67)
