SYSTRAN Purely Neural MT Engines for WMT2017 (1709.03814v1)
Published 12 Sep 2017 in cs.CL
Abstract: This paper describes SYSTRAN's systems submitted to the WMT 2017 shared news translation task for English-German, in both translation directions. Our systems are built using OpenNMT, an open-source neural machine translation system implementing sequence-to-sequence models with LSTM encoder/decoders and attention. We experimented with monolingual data that was automatically back-translated. Our resulting models are further hyper-specialised with an adaptation technique that finely tunes models according to the evaluation test sentences.
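The back-translation step mentioned in the abstract can be sketched as a data-augmentation loop: monolingual target-language sentences are machine-translated back into the source language, and the resulting synthetic pairs are mixed with the genuine parallel corpus. The snippet below is a minimal illustration of that pipeline, not the authors' system; `toy_backtranslate` is a hypothetical placeholder for a real target-to-source NMT model (such as one trained with OpenNMT).

```python
def toy_backtranslate(de_sentence: str) -> str:
    """Hypothetical stand-in for a German->English NMT model.

    A real setup would decode with a trained reverse-direction model;
    here we just tag the input so synthetic pairs are visible.
    """
    return "<bt> " + de_sentence


def build_training_set(parallel, monolingual_de):
    """Mix genuine parallel pairs with synthetic (back-translated) pairs."""
    synthetic = [(toy_backtranslate(de), de) for de in monolingual_de]
    return parallel + synthetic


parallel = [("the house is small", "das Haus ist klein")]
mono_de = ["das Auto ist rot"]

train = build_training_set(parallel, mono_de)
# train now contains one genuine pair and one synthetic pair,
# both usable as (source, target) training examples.
```

In practice the synthetic source side is noisier than real data, so systems often balance or upsample the genuine parallel portion when mixing the two.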
- Yongchao Deng
- Jungi Kim
- Guillaume Klein
- Catherine Kobus
- Natalia Segal
- Christophe Servan
- Bo Wang
- Dakun Zhang
- Josep Crego
- Jean Senellart