Evaluating the Cross-Lingual Effectiveness of Massively Multilingual Neural Machine Translation (1909.00437v1)

Published 1 Sep 2019 in cs.CL

Abstract: The recently proposed massively multilingual neural machine translation (NMT) system has been shown to be capable of translating over 100 languages to and from English within a single model. Its improved translation performance on low-resource languages hints at potential cross-lingual transfer capability for downstream tasks. In this paper, we evaluate the cross-lingual effectiveness of representations from the encoder of a massively multilingual NMT model on 5 downstream classification and sequence labeling tasks covering a diverse set of over 50 languages. We compare against a strong baseline, multilingual BERT (mBERT), in different cross-lingual transfer learning scenarios and show gains in zero-shot transfer on 4 of these 5 tasks.
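As a rough illustration of the zero-shot transfer setup the abstract describes (train a task head on frozen encoder representations using English labels only, then apply it unchanged to other languages), here is a minimal sketch. It is not the paper's implementation: the stand-in nn.TransformerEncoder, the vocabulary and dimension sizes, and the random tensors are hypothetical placeholders for the pretrained NMT encoder and real tokenized data.

```python
# Minimal sketch of zero-shot cross-lingual transfer, assuming a generic
# frozen multilingual encoder. In the paper's setting, the encoder weights
# would come from the massively multilingual NMT system; here a small
# nn.TransformerEncoder stands in as a placeholder.
import torch
import torch.nn as nn

VOCAB, DIM, NUM_CLASSES = 32000, 512, 3  # hypothetical sizes

class FrozenEncoderClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Placeholder for the pretrained multilingual encoder; it is kept
        # frozen, so only the representations it produces are used.
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        for p in self.parameters():
            p.requires_grad = False  # freeze everything defined so far
        # Only this task head is trained, and only on English labels.
        self.head = nn.Linear(DIM, NUM_CLASSES)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, dim)
        return self.head(h.mean(dim=1))          # mean-pool, then classify

model = FrozenEncoderClassifier()
opt = torch.optim.Adam(model.head.parameters(), lr=1e-3)

# Hypothetical English training batch: random ids stand in for tokenized text.
en_x = torch.randint(0, VOCAB, (16, 20))
en_y = torch.randint(0, NUM_CLASSES, (16,))
opt.zero_grad()
loss = nn.functional.cross_entropy(model(en_x), en_y)
loss.backward()
opt.step()

# Zero-shot evaluation: the same English-trained head is applied directly to
# another language's tokenized input; no target-language labels are used.
xx_x = torch.randint(0, VOCAB, (16, 20))
with torch.no_grad():
    preds = model(xx_x).argmax(dim=-1)
```

The key design point the sketch captures is that all cross-lingual generalization must come from the shared encoder representations, since the classifier head itself never sees a non-English example.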

Authors (8)
  1. Aditya Siddhant (22 papers)
  2. Melvin Johnson (35 papers)
  3. Henry Tsai (5 papers)
  4. Naveen Arivazhagan (15 papers)
  5. Jason Riesa (20 papers)
  6. Ankur Bapna (53 papers)
  7. Orhan Firat (80 papers)
  8. Karthik Raman (26 papers)
Citations (68)