
Multilingual Document-Level Translation Enables Zero-Shot Transfer From Sentences to Documents (2109.10341v2)

Published 21 Sep 2021 in cs.CL and cs.LG

Abstract: Document-level neural machine translation (DocNMT) achieves coherent translations by incorporating cross-sentence context. However, for most language pairs there is a shortage of parallel documents, although parallel sentences are readily available. In this paper, we study whether and how contextual modeling in DocNMT is transferable via multilingual modeling. We focus on the scenario of zero-shot transfer from teacher languages with document-level data to student languages with no documents but sentence-level data, and for the first time treat document-level translation as a transfer learning problem. Using simple concatenation-based DocNMT, we explore the effect of three factors on the transfer: the number of teacher languages with document-level data, the balance between document- and sentence-level data at training, and the data condition of parallel documents (genuine vs. backtranslated). Our experiments on Europarl-7 and IWSLT-10 show the feasibility of multilingual transfer for DocNMT, particularly on document-specific metrics. We observe that more teacher languages and adequate data balance both contribute to better transfer quality. Surprisingly, the transfer is less sensitive to the data condition, where multilingual DocNMT delivers decent performance with either backtranslated or genuine document pairs.
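The "simple concatenation-based DocNMT" mentioned in the abstract can be sketched as follows: consecutive sentence pairs from the same document are joined with a separator token so that a standard sentence-level NMT architecture can be trained on multi-sentence inputs. This is an illustrative sketch only; the separator token, window size, and function names here are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of concatenation-based DocNMT data preparation.
# A sliding window groups consecutive (source, target) sentence pairs
# from one document; each window is concatenated with a separator token
# into a single document-level training example.

SEP = " <sep> "  # illustrative separator token, not the paper's choice


def make_doc_examples(sentence_pairs, window=3):
    """Concatenate each window of sentence pairs into one
    document-level (source, target) training example."""
    examples = []
    for i in range(len(sentence_pairs) - window + 1):
        chunk = sentence_pairs[i:i + window]
        src = SEP.join(s for s, _ in chunk)
        tgt = SEP.join(t for _, t in chunk)
        examples.append((src, tgt))
    return examples


pairs = [
    ("Er kam an.", "He arrived."),
    ("Es regnete.", "It was raining."),
    ("Er blieb.", "He stayed."),
]
for src, tgt in make_doc_examples(pairs, window=2):
    print(src, "=>", tgt)
```

Under this setup, teacher languages would contribute such concatenated examples while student languages contribute only single-sentence pairs, which is what makes the zero-shot sentence-to-document transfer question well-posed.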

Authors (6)
  1. Biao Zhang (76 papers)
  2. Ankur Bapna (53 papers)
  3. Melvin Johnson (35 papers)
  4. Ali Dabirmoghaddam (3 papers)
  5. Naveen Arivazhagan (15 papers)
  6. Orhan Firat (80 papers)
Citations (10)