
SMDT: Selective Memory-Augmented Neural Document Translation (2201.01631v1)

Published 5 Jan 2022 in cs.CL and cs.AI

Abstract: Existing document-level neural machine translation (NMT) models have sufficiently explored different context settings to provide guidance for target generation. However, little attention is paid to inaugurate more diverse context for abundant context information. In this paper, we propose a Selective Memory-augmented Neural Document Translation model to deal with documents containing large hypothesis space of the context. Specifically, we retrieve similar bilingual sentence pairs from the training corpus to augment global context and then extend the two-stream attention model with selective mechanism to capture local context and diverse global contexts. This unified approach allows our model to be trained elegantly on three publicly document-level machine translation datasets and significantly outperforms previous document-level NMT models.
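The retrieval step described in the abstract — finding similar bilingual sentence pairs in the training corpus to serve as global-context memory — can be illustrated with a minimal sketch. The paper does not specify the similarity function here; this example assumes a simple bag-of-words cosine similarity over source sentences, and the corpus, sentences, and `retrieve_memory` helper are hypothetical stand-ins for the actual retrieval component.

```python
from collections import Counter
import math

def cosine_sim(a, b):
    # Bag-of-words cosine similarity between two token lists.
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_memory(source_sentence, corpus, k=2):
    # Rank training pairs (source, target) by source-side similarity
    # to the input sentence and return the top-k as memory context.
    query = source_sentence.lower().split()
    scored = sorted(
        corpus,
        key=lambda pair: cosine_sim(query, pair[0].lower().split()),
        reverse=True,
    )
    return scored[:k]

# Toy bilingual "training corpus" (hypothetical data, not from the paper).
corpus = [
    ("the cat sat on the mat", "le chat était assis sur le tapis"),
    ("stock prices rose sharply", "les cours des actions ont fortement augmenté"),
    ("a cat slept on the sofa", "un chat dormait sur le canapé"),
]

# The retrieved pairs would then be fed to the model as extra global context.
memory = retrieve_memory("the cat slept on the mat", corpus, k=2)
```

In the full model, the retrieved pairs are encoded alongside the current sentence, and the selective two-stream attention decides how much each retrieved pair contributes to the translation.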

Authors (7)
  1. Xu Zhang (343 papers)
  2. Jian Yang (503 papers)
  3. Haoyang Huang (27 papers)
  4. Shuming Ma (83 papers)
  5. Dongdong Zhang (79 papers)
  6. Jinlong Li (50 papers)
  7. Furu Wei (291 papers)
Citations (1)