
Implicit Distortion and Fertility Models for Attention-based Encoder-Decoder NMT Model (1601.03317v3)

Published 13 Jan 2016 in cs.CL

Abstract: Neural machine translation (NMT) has recently shown very promising results. Most NMT models follow the encoder-decoder framework. To make encoder-decoder models more flexible, the attention mechanism was introduced to machine translation, as well as to other tasks such as speech recognition and image captioning. We observe that translation quality in attention-based encoder-decoder models can be significantly damaged when the alignment is incorrect. We attribute these problems to the lack of distortion and fertility models. To resolve them, we propose new variations of the attention-based encoder-decoder and compare them with other models on machine translation. Our proposed method achieved an improvement of 2 BLEU points over the original attention-based encoder-decoder.
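
The abstract does not give the paper's concrete formulation, so the sketch below is only a generic illustration of how distortion and fertility ideas can bias attention: a position-based penalty discourages large jumps from the previously attended source position (distortion), and a running coverage vector discourages re-attending source words that have already received their share of attention mass (fertility). The function name `attend` and the weights `gamma` and `beta` are hypothetical, and the quadratic distortion term and linear coverage penalty are stand-ins, not the authors' exact method.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(scores, prev_pos, coverage, gamma=1.0, beta=1.0):
    """One decoding step of attention with illustrative distortion
    and fertility terms (hypothetical formulation).

    scores   : (src_len,) raw alignment scores for the current target word
    prev_pos : expected source position attended at the previous step
    coverage : (src_len,) cumulative attention mass over past steps
    gamma    : distortion penalty weight (assumed hyperparameter)
    beta     : fertility penalty weight (assumed hyperparameter)
    """
    positions = np.arange(len(scores), dtype=float)
    # distortion: prefer positions near prev_pos + 1 (roughly monotone alignment)
    distortion = -gamma * (positions - (prev_pos + 1.0)) ** 2
    # fertility: penalize source words that already absorbed much attention
    fertility = -beta * coverage
    weights = softmax(scores + distortion + fertility)
    new_pos = float(weights @ positions)  # expected attended position
    return weights, new_pos, coverage + weights

# usage: three source words, one decoder step with flat raw scores;
# the distortion term alone pulls attention toward the first position
scores = np.zeros(3)
weights, pos, cov = attend(scores, prev_pos=-1.0, coverage=np.zeros(3))
print(weights, pos)
```

With flat raw scores, the distortion term dominates and attention concentrates near the expected next position; as coverage accumulates, the fertility term redistributes mass away from already-translated words, which is the intuition behind avoiding the over- and under-translation errors the paper attributes to plain attention.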

Authors (4)
  1. Shi Feng (95 papers)
  2. Shujie Liu (101 papers)
  3. Mu Li (95 papers)
  4. Ming Zhou (182 papers)
Citations (44)
