Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism (1810.06195v1)

Published 15 Oct 2018 in cs.CL

Abstract: Pronouns are frequently omitted in pro-drop languages, such as Chinese, generally leading to significant challenges with respect to the production of complete translations. Recently, Wang et al. (2018) proposed a novel reconstruction-based approach to alleviating dropped pronoun (DP) translation problems for neural machine translation models. In this work, we improve the original model from two perspectives. First, we employ a shared reconstructor to better exploit encoder and decoder representations. Second, we jointly learn to translate and predict DPs in an end-to-end manner, to avoid the errors propagated from an external DP prediction model. Experimental results show that our approach significantly improves both translation performance and DP prediction accuracy.
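The abstract describes training with a shared reconstructor and end-to-end DP prediction rather than a pipeline. A minimal sketch of how such a joint objective might be combined is below; the function name and the weighting hyperparameters `lambda_rec` and `lambda_dp` are hypothetical, not taken from the paper.

```python
def joint_loss(translation_loss: float,
               reconstruction_loss: float,
               dp_prediction_loss: float,
               lambda_rec: float = 1.0,
               lambda_dp: float = 1.0) -> float:
    """Illustrative joint objective: weight and sum the three training
    signals the abstract mentions (translation, reconstruction from the
    shared reconstructor, and dropped-pronoun prediction)."""
    return (translation_loss
            + lambda_rec * reconstruction_loss
            + lambda_dp * dp_prediction_loss)

# Example: equal weighting of all three objectives
total = joint_loss(2.0, 0.5, 0.3)
```

Because all three losses share the encoder/decoder representations, gradients from the reconstruction and DP terms flow back into the translation model, which is what avoids the error propagation of an external DP predictor.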

Authors (4)
  1. Longyue Wang (87 papers)
  2. Zhaopeng Tu (135 papers)
  3. Andy Way (46 papers)
  4. Qun Liu (230 papers)
Citations (27)