The NiuTrans Machine Translation Systems for WMT21 (2109.10485v1)
Published 22 Sep 2021 in cs.CL
Abstract: This paper describes the NiuTrans neural machine translation systems for the WMT 2021 news translation tasks. We made submissions in 9 language directions, including English$\leftrightarrow${Chinese, Japanese, Russian, Icelandic} and English$\rightarrow$Hausa. Our primary systems are built on several effective variants of the Transformer, e.g., Transformer-DLCL and ODE-Transformer. We also utilize back-translation, knowledge distillation, post-ensemble, and iterative fine-tuning techniques to further enhance model performance.
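The abstract names post-ensemble among the enhancement techniques. As a rough illustration only (not the authors' actual implementation), the sketch below shows one common ensembling strategy: averaging next-token probabilities across multiple models before picking the best candidate. The two toy model distributions are hypothetical.

```python
import math

def ensemble_log_probs(per_model_log_probs):
    """Combine next-token predictions from several models by
    averaging in probability space (a common ensemble strategy),
    returning a dict of token -> combined log-probability."""
    tokens = per_model_log_probs[0].keys()
    combined = {}
    for tok in tokens:
        # Average the probabilities, then move back to log space.
        p = sum(math.exp(lp[tok]) for lp in per_model_log_probs)
        combined[tok] = math.log(p / len(per_model_log_probs))
    return combined

# Two hypothetical models that disagree on the next token.
m1 = {"a": math.log(0.7), "b": math.log(0.3)}
m2 = {"a": math.log(0.4), "b": math.log(0.6)}

avg = ensemble_log_probs([m1, m2])
best = max(avg, key=avg.get)  # token preferred by the ensemble
```

Here the ensemble prefers token "a" (average probability 0.55 vs. 0.45), even though the two models individually disagree; in real NMT decoding, this averaging would be applied at every step of beam search.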
- Shuhan Zhou
- Tao Zhou
- Binghao Wei
- Yingfeng Luo
- Yongyu Mu
- Zefan Zhou
- Chenglong Wang
- Xuanjun Zhou
- Chuanhao Lv
- Yi Jing
- Laohu Wang
- Jingnan Zhang
- Canan Huang
- Zhongxiang Yan
- Chi Hu
- Bei Li
- Tong Xiao
- Jingbo Zhu