
Neural Machine Translation with Pivot Languages (1611.04928v2)

Published 15 Nov 2016 in cs.CL

Abstract: While recent neural machine translation approaches have delivered state-of-the-art performance for resource-rich language pairs, they suffer from the data scarcity problem for resource-scarce language pairs. Although this problem can be alleviated by exploiting a pivot language to bridge the source and target languages, the source-to-pivot and pivot-to-target translation models are usually independently trained. In this work, we introduce a joint training algorithm for pivot-based neural machine translation. We propose three methods to connect the two models and enable them to interact with each other during training. Experiments on Europarl and WMT corpora show that joint training of source-to-pivot and pivot-to-target models leads to significant improvements over independent training across various languages.
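The core idea of the abstract — summing the source-to-pivot and pivot-to-target losses and adding a term that connects the two models during training — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the `nll_loss`, `embedding_agreement`, and `joint_loss` functions and the weight `lam` are hypothetical names, and the connection term shown (agreement between the two models' pivot-side word embeddings) is just one plausible way to couple the models.

```python
import numpy as np

def nll_loss(token_probs):
    # Negative log-likelihood of the gold tokens under a translation model.
    return -np.sum(np.log(token_probs))

def embedding_agreement(emb_src2piv, emb_piv2tgt):
    # Hypothetical connection term: squared distance between the pivot-side
    # word embeddings of the two models, pushing them to agree.
    return np.sum((emb_src2piv - emb_piv2tgt) ** 2)

def joint_loss(probs_sp, probs_pt, emb_sp, emb_pt, lam=0.1):
    # Joint objective = L(source->pivot) + L(pivot->target)
    #                   + lam * connection term linking the two models.
    return (nll_loss(probs_sp)
            + nll_loss(probs_pt)
            + lam * embedding_agreement(emb_sp, emb_pt))
```

Minimizing this joint objective updates both models together, in contrast to the independent training the paper argues against.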

Authors (5)
  1. Yong Cheng (58 papers)
  2. Yang Liu (2253 papers)
  3. Qian Yang (146 papers)
  4. Maosong Sun (337 papers)
  5. Wei Xu (536 papers)
Citations (93)
