Context Gates for Neural Machine Translation (1608.06043v3)

Published 22 Aug 2016 in cs.CL

Abstract: In neural machine translation (NMT), generation of a target word depends on both source and target contexts. We find that source contexts have a direct impact on the adequacy of a translation while target contexts affect the fluency. Intuitively, generation of a content word should rely more on the source context and generation of a functional word should rely more on the target context. Due to the lack of effective control over the influence from source and target contexts, conventional NMT tends to yield fluent but inadequate translations. To address this problem, we propose context gates which dynamically control the ratios at which source and target contexts contribute to the generation of target words. In this way, we can enhance both the adequacy and fluency of NMT with more careful control of the information flow from contexts. Experiments show that our approach significantly improves upon a standard attention-based NMT system by +2.3 BLEU points.
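
The gating mechanism described in the abstract is easy to sketch. The following is a minimal illustration, not the authors' released implementation: it assumes a PyTorch attention-based decoder, and all module, argument, and dimension names (`prev_emb`, `prev_state`, `src_context`, etc.) are hypothetical stand-ins for the previous target word embedding, the previous decoder state, and the source context vector produced by attention.

```python
import torch
import torch.nn as nn

class ContextGate(nn.Module):
    """Sigmoid gate z_t that balances source and target contexts when
    computing the next decoder state (a sketch of the gating idea,
    applied to both context streams)."""

    def __init__(self, emb_dim: int, hidden_dim: int, ctx_dim: int):
        super().__init__()
        # z_t is predicted from the previous word embedding, the previous
        # decoder state, and the current attention (source) context.
        self.gate = nn.Linear(emb_dim + hidden_dim + ctx_dim, hidden_dim)
        # Separate projections for the target-side and source-side streams.
        self.target_proj = nn.Linear(emb_dim + hidden_dim, hidden_dim)
        self.source_proj = nn.Linear(ctx_dim, hidden_dim)

    def forward(self, prev_emb, prev_state, src_context):
        z = torch.sigmoid(
            self.gate(torch.cat([prev_emb, prev_state, src_context], dim=-1))
        )
        target_part = self.target_proj(torch.cat([prev_emb, prev_state], dim=-1))
        source_part = self.source_proj(src_context)
        # Large z favors the source context (content words); small z favors
        # the target context (function words).
        return torch.tanh((1.0 - z) * target_part + z * source_part)

# Toy usage with made-up batch size and dimensions:
gate = ContextGate(emb_dim=8, hidden_dim=16, ctx_dim=12)
state_input = gate(torch.randn(2, 8), torch.randn(2, 16), torch.randn(2, 12))
print(state_input.shape)  # torch.Size([2, 16])
```

In a full decoder, this gated pre-activation would replace the ungated sum of target-side and source-side terms when updating the recurrent state, so the network learns, per target word, how much to read from each side.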

Authors (5)
  1. Zhaopeng Tu (135 papers)
  2. Yang Liu (2253 papers)
  3. Zhengdong Lu (35 papers)
  4. Xiaohua Liu (9 papers)
  5. Hang Li (277 papers)
Citations (131)
