
Deconvolution-Based Global Decoding for Neural Machine Translation (1806.03692v1)

Published 10 Jun 2018 in cs.CL, cs.AI, and cs.LG

Abstract: A great proportion of sequence-to-sequence (Seq2Seq) models for Neural Machine Translation (NMT) adopt a Recurrent Neural Network (RNN) to generate the translation word by word in sequential order. Since linguistic studies have shown that language is not a linear word sequence but a sequence with complex structure, translation at each step should be conditioned on the whole target-side context. To tackle this problem, we propose a new NMT model that decodes the sequence with the guidance of a structural prediction of the target-side context. Our model generates the translation based on this structural prediction, so the translation is freed from the constraint of sequential order. Experimental results demonstrate that our model is competitive with state-of-the-art methods, and the analysis shows that it is robust to translating sentences of different lengths and reduces repetition by using the predicted target-side context to guide decoding.
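
The abstract describes decoding guided by a structural prediction of the whole target-side context, obtained with deconvolution. The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation: a transposed convolution expands an encoder summary vector into a fixed number of global context slots, and each decoding step attends over those slots. All module and parameter names here (`DeconvGlobalContext`, `ContextAttentiveDecoderStep`, `context_len`) are illustrative assumptions.

```python
# Illustrative sketch of deconvolution-based global decoding (assumed design,
# not the paper's code). A transposed convolution predicts a target-side
# context matrix from the encoder summary; the RNN decoder attends over it
# at every step so generation is conditioned on global target-side structure.
import torch
import torch.nn as nn


class DeconvGlobalContext(nn.Module):
    """Predicts a fixed-length target-side context matrix from the encoder state."""

    def __init__(self, hidden_size: int, context_len: int = 10):
        super().__init__()
        # Expand one encoder summary vector into `context_len` context slots.
        self.deconv = nn.ConvTranspose1d(
            in_channels=hidden_size,
            out_channels=hidden_size,
            kernel_size=context_len,
        )

    def forward(self, enc_summary: torch.Tensor) -> torch.Tensor:
        # enc_summary: (batch, hidden) -> (batch, hidden, 1)
        x = enc_summary.unsqueeze(-1)
        # Output: (batch, hidden, context_len), one column per predicted slot.
        return self.deconv(x)


class ContextAttentiveDecoderStep(nn.Module):
    """One RNN decoding step that attends over the predicted global context."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        # Input is the previous token embedding (assumed size hidden_size)
        # concatenated with the attended context vector.
        self.rnn = nn.GRUCell(hidden_size * 2, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, prev_emb, prev_state, global_ctx):
        # Attention scores over the context slots: (batch, context_len).
        scores = torch.einsum("bh,bhl->bl", prev_state, global_ctx)
        attn = torch.softmax(scores, dim=-1)
        # Weighted context vector: (batch, hidden_size).
        ctx = torch.einsum("bl,bhl->bh", attn, global_ctx)
        state = self.rnn(torch.cat([prev_emb, ctx], dim=-1), prev_state)
        return self.out(state), state
```

In a full model, the context predictor and the decoder would be trained jointly with the translation objective, so the slots learn to capture target-side structure rather than fixed positions; the number of slots is a design choice assumed here for illustration.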

Authors (6)
  1. Junyang Lin (99 papers)
  2. Xu Sun (194 papers)
  3. Xuancheng Ren (59 papers)
  4. Shuming Ma (83 papers)
  5. Jinsong Su (96 papers)
  6. Qi Su (58 papers)
Citations (14)
