
Scheduled Multi-Task Learning: From Syntax to Translation (1804.08915v1)

Published 24 Apr 2018 in cs.CL

Abstract: Neural encoder-decoder models of machine translation have achieved impressive results, while learning linguistic knowledge of both the source and target languages in an implicit end-to-end manner. We propose a framework in which our model begins by learning syntax and translation interleaved, gradually putting more focus on translation. Using this approach, we achieve considerable improvements in terms of BLEU score on a relatively large parallel corpus (WMT14 English-to-German) and in a low-resource setup (WIT German-to-English).
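The abstract describes a schedule that interleaves a syntax task with translation and gradually shifts focus toward translation. A minimal sketch of one such schedule is below; the sigmoid shape, the `sharpness` parameter, and the function names are illustrative assumptions, not the paper's actual formulation.

```python
import math
import random

def translation_focus(step, total_steps, sharpness=5.0):
    # Illustrative sigmoid schedule (an assumption, not the paper's exact
    # schedule): probability of sampling the translation task rises from
    # near 0 toward 1 as training progresses.
    x = step / total_steps  # training progress in [0, 1]
    return 1.0 / (1.0 + math.exp(-sharpness * (2 * x - 1)))

def sample_task(step, total_steps, rng=random):
    # Pick the translation task with increasing probability; otherwise
    # train on the auxiliary syntax task.
    p = translation_focus(step, total_steps)
    return "translation" if rng.random() < p else "syntax"
```

Early in training `sample_task` mostly returns the syntax task, and late in training it almost always returns translation, matching the "gradually putting more focus on translation" idea.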

Authors (2)
  1. Eliyahu Kiperwasser (5 papers)
  2. Miguel Ballesteros (70 papers)
Citations (80)