
Advancing Multilingual Pre-training: TRIP Triangular Document-level Pre-training for Multilingual Language Models (2212.07752v2)

Published 15 Dec 2022 in cs.CL

Abstract: Despite the success of multilingual sequence-to-sequence pre-training, most existing approaches rely on document-level monolingual corpora in many different languages, sentence-level bilingual corpora,\footnote{In this paper, we use `bilingual corpora' to denote parallel corpora with `bilingual translation pairs' in many different language pairs, each consisting of two sentences/documents with the same meaning written in different languages. We use `trilingual corpora' to denote parallel corpora with `trilingual translation pairs' in many different language combinations, each consisting of three sentences/documents.} and sometimes synthetic document-level bilingual corpora. This hampers the performance with cross-lingual document-level tasks such as document-level translation. Therefore, we propose to mine and leverage document-level trilingual parallel corpora to improve sequence-to-sequence multilingual pre-training. We present \textbf{Tri}angular Document-level \textbf{P}re-training (\textbf{TRIP}), which is the first in the field to accelerate the conventional monolingual and bilingual objectives into a trilingual objective with a novel method called Grafting. Experiments show that TRIP achieves several strong state-of-the-art (SOTA) scores on three multilingual document-level machine translation benchmarks and one cross-lingual abstractive summarization benchmark, including consistent improvements by up to 3.11 d-BLEU points and 8.9 ROUGE-L points.
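The abstract reports translation gains in d-BLEU (document-level BLEU), which scores the concatenation of all sentences in a document as one sequence rather than scoring each sentence independently, so n-gram matches can span sentence boundaries. A minimal sketch of the idea follows; the function name, tokenization, and smoothing are illustrative assumptions, not the paper's evaluation code:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def doc_bleu(hyp_sents, ref_sents, max_n=4):
    """Toy document-level BLEU: concatenate the document's sentences
    into one token stream before scoring, so n-grams may cross
    sentence boundaries (the key difference from sentence-level BLEU)."""
    hyp = [tok for s in hyp_sents for tok in s.split()]
    ref = [tok for s in ref_sents for tok in s.split()]
    log_prec = 0.0
    for n in range(1, max_n + 1):
        h, r = ngrams(hyp, n), ngrams(ref, n)
        matched = sum(min(count, r[g]) for g, count in h.items())
        total = max(sum(h.values()), 1)
        # Crude smoothing so toy inputs with zero matches avoid log(0);
        # real implementations (e.g. sacreBLEU) smooth differently.
        log_prec += math.log(max(matched, 1) / total)
    # Brevity penalty, as in standard BLEU.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * math.exp(log_prec / max_n)

# A hypothesis identical to its reference scores 100.
score = doc_bleu(["the cat sat", "on the mat"],
                 ["the cat sat", "on the mat"])
```

Because the whole document is scored as one stream, a model that translates sentences well but orders or connects them poorly (e.g. inconsistent pronouns across sentences) is penalized, which is why d-BLEU is the preferred metric for document-level translation benchmarks like those used here.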

Authors (6)
  1. Hongyuan Lu (18 papers)
  2. Haoyang Huang (27 papers)
  3. Shuming Ma (83 papers)
  4. Dongdong Zhang (79 papers)
  5. Wai Lam (117 papers)
  6. Furu Wei (291 papers)
Citations (3)