Modeling Fluency and Faithfulness for Diverse Neural Machine Translation (1912.00178v1)

Published 30 Nov 2019 in cs.CL and cs.LG

Abstract: Neural machine translation models are usually trained with the teacher forcing strategy, which requires the predicted sequence to match the ground truth word by word and forces the probability of each prediction to approach a 0-1 distribution. However, this strategy assigns the entire probability mass to the ground truth word and ignores all other words in the target vocabulary, even when the ground truth word cannot dominate the distribution. To address this problem with teacher forcing, we propose introducing an evaluation module to guide the distribution of each prediction. The evaluation module assesses each prediction from the perspectives of fluency and faithfulness, encouraging the model to generate words that connect fluently with the past and future translation while forming a translation equivalent in meaning to the source. Experiments on multiple translation tasks show that our method achieves significant improvements over strong baselines.
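To make the idea concrete, the sketch below shows one plausible form such an evaluation-guided objective could take: instead of training against a one-hot (0-1) target, the target distribution is softened by per-token scores from fluency and faithfulness evaluators. This is a minimal illustration only, not the paper's actual formulation; the function name, the score tensors, and the interpolation weight `alpha` are all hypothetical.

```python
import torch
import torch.nn.functional as F

def evaluation_guided_loss(logits, target_ids, fluency_scores,
                           faithfulness_scores, alpha=0.5):
    """Hypothetical sketch of an evaluation-guided training objective.

    Rather than forcing each prediction toward a one-hot distribution
    (plain teacher forcing), the target is interpolated with a
    distribution derived from fluency and faithfulness evaluators.

    logits:              (batch, seq_len, vocab) raw decoder outputs
    target_ids:          (batch, seq_len) ground-truth token indices
    fluency_scores:      (batch, seq_len, vocab) fluency evaluator scores
    faithfulness_scores: (batch, seq_len, vocab) faithfulness evaluator scores
    alpha:               assumed hyperparameter mixing one-hot and
                         evaluation-derived targets
    """
    vocab = logits.size(-1)
    one_hot = F.one_hot(target_ids, num_classes=vocab).float()

    # Combine the two evaluation signals and normalize into a distribution.
    eval_dist = F.softmax(fluency_scores + faithfulness_scores, dim=-1)

    # Soft target: probability mass is shared with plausible alternatives
    # instead of collapsing entirely onto the ground-truth word.
    soft_target = alpha * one_hot + (1.0 - alpha) * eval_dist

    log_probs = F.log_softmax(logits, dim=-1)
    # Cross-entropy against the soft target (equals KL up to a constant).
    return -(soft_target * log_probs).sum(dim=-1).mean()
```

The key departure from plain teacher forcing is that the target no longer collapses to a one-hot vector: words the evaluators judge fluent and faithful retain some probability mass, which is what allows the model to produce more diverse translations.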

Authors (7)
  1. Yang Feng (230 papers)
  2. Wanying Xie (4 papers)
  3. Shuhao Gu (21 papers)
  4. Chenze Shao (22 papers)
  5. Wen Zhang (170 papers)
  6. Zhengxin Yang (8 papers)
  7. Dong Yu (329 papers)
Citations (22)