
A General Framework for Adaptation of Neural Machine Translation to Simultaneous Translation (1911.03154v2)

Published 8 Nov 2019 in cs.CL

Abstract: Despite the success of neural machine translation (NMT), simultaneous neural machine translation (SNMT), the task of translating in real time before a full sentence has been observed, remains challenging due to the syntactic structure difference and simultaneity requirements. In this paper, we propose a general framework for adapting neural machine translation to translate simultaneously. Our framework contains two parts: prefix translation that utilizes a consecutive NMT model to translate source prefixes and a stopping criterion that determines when to stop the prefix translation. Experiments on three translation corpora and two language pairs show the efficacy of the proposed framework on balancing the quality and latency in adapting NMT to perform simultaneous translation.
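The two-part framework described in the abstract — prefix translation with a consecutive NMT model plus a stopping criterion — can be sketched generically. In the sketch below, `translate_prefix` is a hypothetical stand-in for the NMT model, and the length-based stopping rule is one illustrative instance of a criterion, not the paper's actual implementation.

```python
def translate_prefix(source_prefix):
    # Hypothetical stand-in for a consecutive NMT model decoding a
    # source prefix; here each source token is trivially "translated"
    # by uppercasing it.
    return [tok.upper() for tok in source_prefix]

def simultaneous_translate(source_stream, wait=2):
    # Generic adaptation loop: as source tokens arrive, re-translate the
    # growing prefix and commit target tokens only while the stopping
    # criterion allows (here: the source must lead the committed output
    # by at least `wait` tokens -- an assumed, illustrative rule).
    source_prefix, committed = [], []
    for token in source_stream:
        source_prefix.append(token)
        hypothesis = translate_prefix(source_prefix)
        while (len(committed) < len(hypothesis)
               and len(source_prefix) - len(committed) >= wait):
            committed.append(hypothesis[len(committed)])
    # Source sentence finished: flush the remaining translation.
    hypothesis = translate_prefix(source_prefix)
    committed.extend(hypothesis[len(committed):])
    return committed
```

Tightening `wait` lowers latency at the cost of translating from shorter, less informative prefixes, which is the quality/latency balance the paper evaluates.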

Authors (5)
  1. Yun Chen
  2. Liangyou Li
  3. Xin Jiang
  4. Xiao Chen
  5. Qun Liu
Citations (2)
