
Neural Phrase-to-Phrase Machine Translation (1811.02172v1)

Published 6 Nov 2018 in cs.CL, cs.LG, and stat.ML

Abstract: In this paper, we propose Neural Phrase-to-Phrase Machine Translation (NP$^2$MT). Our model uses a phrase attention mechanism to discover relevant input (source) segments that are used by a decoder to generate output (target) phrases. We also design an efficient dynamic programming algorithm to decode segments that allows the model to be trained faster than the existing neural phrase-based machine translation method by Huang et al. (2018). Furthermore, our method can naturally integrate with external phrase dictionaries during decoding. Empirical experiments show that our method achieves comparable performance with the state-of-the-art methods on benchmark datasets. However, when the training and testing data are from different distributions or domains, our method performs better.
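The phrase attention mechanism described above can be sketched minimally: instead of attending over individual source tokens, the decoder attends over source segments (phrases). The sketch below represents each segment by the mean of its token encodings and uses dot-product scoring; the function name, segment representation, and scoring choice are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def phrase_attention(decoder_state, source_encodings, segment_bounds):
    """Attend over source *segments* (phrases) rather than single tokens.

    decoder_state:    (d,) current decoder hidden state
    source_encodings: (n, d) encoder outputs for n source tokens
    segment_bounds:   list of (start, end) pairs partitioning the source
    Returns a (d,) context vector: a weighted sum of segment representations.
    """
    # Represent each segment by the mean of its token encodings
    # (one simple choice; the paper's parameterization may differ).
    segs = np.stack([source_encodings[s:e].mean(axis=0)
                     for s, e in segment_bounds])
    scores = segs @ decoder_state       # dot-product relevance scores
    weights = softmax(scores)           # attention distribution over segments
    return weights @ segs               # context vector for the decoder

rng = np.random.default_rng(0)
enc = rng.normal(size=(6, 4))           # 6 source tokens, hidden dim 4
ctx = phrase_attention(rng.normal(size=4), enc, [(0, 2), (2, 5), (5, 6)])
print(ctx.shape)  # (4,)
```

The decoder would then condition on this segment-level context when emitting each target phrase; the dynamic programming decoder in the paper searches over possible segmentations efficiently rather than enumerating them.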

Authors (8)
  1. Jiangtao Feng (24 papers)
  2. Lingpeng Kong (134 papers)
  3. Po-Sen Huang (30 papers)
  4. Chong Wang (308 papers)
  5. Da Huang (67 papers)
  6. Jiayuan Mao (55 papers)
  7. Kan Qiao (1 paper)
  8. Dengyong Zhou (20 papers)
Citations (13)
