Facebook AI's WAT19 Myanmar-English Translation Task Submission (1910.06848v1)

Published 15 Oct 2019 in cs.CL

Abstract: This paper describes Facebook AI's submission to the WAT 2019 Myanmar-English translation task. Our baseline systems are BPE-based transformer models. We explore methods to leverage monolingual data to improve generalization, including self-training, back-translation and their combination. We further improve results by using noisy channel re-ranking and ensembling. We demonstrate that these techniques can significantly improve not only a system trained with additional monolingual data, but even the baseline system trained exclusively on the provided small parallel dataset. Our system ranks first in both directions according to human evaluation and BLEU, with a gain of over 8 BLEU points above the second best system.
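The abstract names several standard techniques that are easy to sketch in code. First, the two data-augmentation methods: back-translation pairs monolingual target-side text with synthetic sources from a reverse (target-to-source) model, while self-training pairs monolingual source-side text with the forward model's own outputs. The sketch below is a hypothetical Python illustration, not the paper's implementation; `forward_translate` and `backward_translate` stand in for trained NMT systems, and the mixing of real and synthetic pairs is simplified (the paper's exact ratios and noising are not shown).

```python
from typing import Callable, List, Tuple

def back_translate(
    mono_target: List[str],
    backward_translate: Callable[[str], str],  # assumed target -> source model
) -> List[Tuple[str, str]]:
    """Pair monolingual target sentences with synthetic sources (back-translation)."""
    return [(backward_translate(y), y) for y in mono_target]

def self_train(
    mono_source: List[str],
    forward_translate: Callable[[str], str],   # assumed source -> target model
) -> List[Tuple[str, str]]:
    """Pair monolingual source sentences with the model's own outputs (self-training)."""
    return [(x, forward_translate(x)) for x in mono_source]

def build_training_data(parallel, mono_src, mono_tgt, fwd, bwd):
    # Combine real parallel data with both kinds of synthetic pairs;
    # upsampling of the (small) real dataset is omitted for brevity.
    return parallel + self_train(mono_src, fwd) + back_translate(mono_tgt, bwd)
```

Second, noisy channel re-ranking rescores an n-best list of candidate translations using a channel model P(x|y) and a target-side language model P(y) alongside the direct model P(y|x). A minimal sketch, assuming hypothetical log-probability callables and a simple length normalization; the weights lam_channel and lam_lm would be tuned on a validation set:

```python
from typing import Callable, List

def noisy_channel_rerank(
    source: str,
    candidates: List[str],
    log_p_fwd: Callable[[str, str], float],  # log P(y|x), direct model (assumed)
    log_p_bwd: Callable[[str, str], float],  # log P(x|y), channel model (assumed)
    log_p_lm: Callable[[str], float],        # log P(y), target LM (assumed)
    lam_channel: float = 1.0,
    lam_lm: float = 1.0,
) -> str:
    """Return the candidate maximizing the length-normalized noisy channel score."""
    def score(y: str) -> float:
        n = max(len(y.split()), 1)  # crude length normalization (an assumption)
        return (log_p_fwd(source, y)
                + lam_channel * log_p_bwd(source, y)
                + lam_lm * log_p_lm(y)) / n
    return max(candidates, key=score)
```

Finally, ensembling in NMT typically averages the per-step token distributions of several independently trained models inside beam search. A hypothetical single-decoding-step illustration, where each model contributes a token-to-log-probability map:

```python
import math
from typing import Dict, List

def ensemble_step(step_log_probs: List[Dict[str, float]]) -> Dict[str, float]:
    """Average token probabilities across models at one decoding step."""
    vocab = set().union(*step_log_probs)
    n = len(step_log_probs)
    return {
        tok: math.log(
            sum(math.exp(lp.get(tok, float("-inf"))) for lp in step_log_probs) / n
        )
        for tok in vocab
    }
```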

Authors (8)
  1. Peng-Jen Chen (26 papers)
  2. Jiajun Shen (35 papers)
  3. Matt Le (11 papers)
  4. Vishrav Chaudhary (45 papers)
  5. Ahmed El-Kishky (25 papers)
  6. Guillaume Wenzek (12 papers)
  7. Myle Ott (33 papers)
  8. Marc'Aurelio Ranzato (53 papers)
Citations (29)