From Bilingual to Multilingual Neural Machine Translation by Incremental Training (1907.00735v2)
Published 28 Jun 2019 in cs.CL
Abstract: Multilingual Neural Machine Translation approaches rely on task-specific models, so adding one more language requires retraining the whole system. In this work, we propose a new training schedule, based on joint training and language-independent encoder/decoder modules, that allows the system to scale to more languages without modifying the previous components and enables zero-shot translation. This work in progress achieves results close to the state of the art on the WMT task.
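The key idea — language-independent encoder/decoder modules that can be added incrementally and mixed freely, yielding zero-shot directions — can be illustrated with a minimal sketch. This is not the authors' code; the class name, the toy "modules", and the shared-representation interface are all hypothetical stand-ins for illustration only.

```python
# Hedged sketch (hypothetical, not the paper's implementation):
# a registry of per-language encoders/decoders that share an
# intermediate representation. Adding a language registers new
# modules without touching (i.e. retraining) the existing ones.

class ModularMultilingualMT:
    def __init__(self):
        self.encoders = {}
        self.decoders = {}

    def add_language(self, lang, encoder, decoder):
        # Incremental training: only the new language's modules are
        # trained; previously registered modules stay frozen.
        self.encoders[lang] = encoder
        self.decoders[lang] = decoder

    def translate(self, src_lang, tgt_lang, sentence):
        # Any encoder pairs with any decoder through the shared
        # representation, so unseen pairs work zero-shot.
        shared = self.encoders[src_lang](sentence)
        return self.decoders[tgt_lang](shared)

# Toy stand-ins for real neural modules: the "shared representation"
# here is just a token list, and decoders prepend a language tag.
def toy_encoder(sentence):
    return sentence.split()

def make_toy_decoder(tag):
    return lambda shared: f"[{tag}] " + " ".join(shared)

system = ModularMultilingualMT()
system.add_language("en", toy_encoder, make_toy_decoder("en"))
system.add_language("es", toy_encoder, make_toy_decoder("es"))
# A third language is added without retraining the en/es modules.
system.add_language("de", toy_encoder, make_toy_decoder("de"))

# Zero-shot direction: es->de was never trained as a pair.
print(system.translate("es", "de", "hola mundo"))  # → [de] hola mundo
```

The dictionary registry stands in for the paper's joint-training step: the point is that `add_language` never modifies existing entries, which is what lets the system scale incrementally.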
- Carlos Escolano (20 papers)
- José A. R. Fonollosa (23 papers)
- Marta R. Costa-jussà (73 papers)