DiDi's Machine Translation System for WMT2020 (2010.08185v1)
Published 16 Oct 2020 in cs.CL and cs.AI
Abstract: This paper describes DiDi AI Labs' submission to the WMT2020 news translation shared task. We participate in the Chinese->English translation direction, using the Transformer as our baseline model and integrating several techniques for model enhancement, including data filtering, data selection, back-translation, fine-tuning, model ensembling, and re-ranking. As a result, our submission achieves a BLEU score of $36.6$ in Chinese->English.
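The BLEU score reported above can be illustrated with a minimal, self-contained sketch. This is not the official WMT scorer (shared-task results use standardized tooling such as sacrebleu with its own tokenization); it is a simplified single-reference corpus BLEU, assuming whitespace-tokenized input, shown only to clarify what the metric measures:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Multiset of n-grams of order n.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Simplified corpus-level BLEU (single reference per sentence):
    geometric mean of clipped n-gram precisions times a brevity penalty."""
    matches = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # candidate n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            # Counter intersection implements count clipping.
            matches[n - 1] += sum((ngrams(h, n) & ngrams(r, n)).values())
            totals[n - 1] += max(len(h) - n + 1, 0)
    if min(matches) == 0:
        return 0.0
    log_prec = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    # Brevity penalty discourages overly short hypotheses.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

A hypothesis identical to its reference scores 100; partial n-gram overlap lowers each precision term, and short outputs are further penalized by the brevity penalty.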
Authors:
- Tanfang Chen
- Weiwei Wang
- Wenyang Wei
- Xing Shi
- Xiangang Li
- Jieping Ye
- Kevin Knight