
A Template-based Method for Constrained Neural Machine Translation (2205.11255v2)

Published 23 May 2022 in cs.CL

Abstract: Machine translation systems are expected to cope with various types of constraints in many practical scenarios. While neural machine translation (NMT) has achieved strong performance in unconstrained cases, it is non-trivial to impose pre-specified constraints on the translation process of NMT models. Although many approaches have been proposed to address this issue, most existing methods cannot satisfy the following three desiderata at the same time: (1) high translation quality, (2) high match accuracy, and (3) low latency. In this work, we propose a template-based method that yields results with high translation quality and match accuracy, while keeping inference speed comparable to that of unconstrained NMT models. Our basic idea is to rearrange the generation of constrained and unconstrained tokens through a template. Our method does not require any changes to the model architecture or the decoding algorithm. Experimental results show that the proposed template-based approach outperforms several representative baselines in both lexically and structurally constrained translation tasks.
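To make the core idea more concrete, the sketch below illustrates one plausible reading of template-based constrained generation: constraint tokens are copied verbatim from a target-side template, while the remaining slots are filled by an ordinary NMT decoder. This is an assumption-laden illustration, not the paper's actual implementation; the template format, the slot marker `<slot>`, and the helper `nmt_generate_span` are all hypothetical stand-ins.

```python
# Hypothetical sketch of template-style constrained generation.
# The paper's actual template construction, training, and decoding are not
# reproduced here; `nmt_generate_span` stands in for a call to an ordinary
# (unconstrained) NMT decoder that continues a given target prefix.

from typing import Callable, List


def fill_template(
    source: str,
    template: List[str],
    nmt_generate_span: Callable[[str, List[str], int], List[str]],
) -> List[str]:
    """Fill the free slots of a target-side template.

    template: a mix of literal constraint tokens (copied verbatim, which
              guarantees the constraint match) and slot markers "<slot>"
              whose content is generated by the NMT model.
    nmt_generate_span(source, prefix, slot_index): returns the tokens the
              model generates for one slot, conditioned on the source
              sentence and the target prefix produced so far.
    """
    output: List[str] = []
    slot_index = 0
    for token in template:
        if token == "<slot>":
            # Unconstrained tokens: delegate to the model.
            output.extend(nmt_generate_span(source, list(output), slot_index))
            slot_index += 1
        else:
            # Constrained tokens: copy them directly from the template.
            output.append(token)
    return output


# Example: force the terminology "neural machine translation" to appear
# somewhere in the output, with free generation before and after it.
example_template = ["<slot>", "neural", "machine", "translation", "<slot>"]
```

Because constrained tokens are copied rather than searched for during decoding, this style of approach avoids the latency overhead of constrained beam search, which is consistent with the abstract's claim of inference speed comparable to unconstrained NMT.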

Authors (6)
  1. Shuo Wang (382 papers)
  2. Peng Li (390 papers)
  3. Zhixing Tan (20 papers)
  4. Zhaopeng Tu (135 papers)
  5. Maosong Sun (337 papers)
  6. Yang Liu (2253 papers)
Citations (2)
