
WeChat Neural Machine Translation Systems for WMT20 (2010.00247v2)

Published 1 Oct 2020 in cs.CL, cs.AI, and cs.LG

Abstract: We participate in the WMT 2020 shared news translation task on Chinese to English. Our system is based on the Transformer (Vaswani et al., 2017a) with effective variants and the DTMT (Meng and Zhang, 2019) architecture. In our experiments, we employ data selection, several synthetic data generation approaches (i.e., back-translation, knowledge distillation, and iterative in-domain knowledge transfer), advanced finetuning approaches, and self-BLEU based model ensembling. Our constrained Chinese to English system achieves a case-sensitive BLEU score of 36.9, the highest among all submissions.
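The abstract mentions self-BLEU based model ensembling: scoring each candidate system's output against the outputs of the other systems to gauge how similar (or diverse) the candidates are before combining them. The paper's exact procedure is not detailed here, so the following is only an illustrative sketch; the `bleu` helper is a simplified, add-one-smoothed sentence-level BLEU, not the official metric used for the 36.9 result.

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(hyp, ref, max_n=4):
    """Simplified sentence-level BLEU with add-one smoothing (illustrative only)."""
    hyp, ref = hyp.split(), ref.split()
    if not hyp:
        return 0.0
    log_p = 0.0
    for n in range(1, max_n + 1):
        h, r = ngrams(hyp, n), ngrams(ref, n)
        overlap = sum((h & r).values())          # clipped n-gram matches
        total = max(sum(h.values()), 1)
        log_p += math.log((overlap + 1) / (total + 1))  # smoothed precision
    bp = min(1.0, math.exp(1 - len(ref) / len(hyp)))    # brevity penalty
    return bp * math.exp(log_p / max_n)

def self_bleu(outputs):
    """For each system's output, the average BLEU against the other systems' outputs.

    A low self-BLEU means the candidate translates differently from the rest,
    which makes it a useful partner for ensembling.
    """
    scores = []
    for i, hyp in enumerate(outputs):
        others = [bleu(hyp, ref) for j, ref in enumerate(outputs) if j != i]
        scores.append(sum(others) / len(others))
    return scores
```

Under this sketch, candidates whose outputs are near-duplicates of each other score close to 1.0 and add little to an ensemble, while lower-scoring (more diverse) candidates are the ones worth combining.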

Authors (11)
  1. Fandong Meng (174 papers)
  2. Jianhao Yan (27 papers)
  3. Yijin Liu (29 papers)
  4. Yuan Gao (336 papers)
  5. Xianfeng Zeng (5 papers)
  6. Qinsong Zeng (4 papers)
  7. Peng Li (390 papers)
  8. Ming Chen (124 papers)
  9. Jie Zhou (687 papers)
  10. Sifan Liu (28 papers)
  11. Hao Zhou (351 papers)
Citations (21)
