SYSTRAN Purely Neural MT Engines for WMT2017 (1709.03814v1)

Published 12 Sep 2017 in cs.CL

Abstract: This paper describes SYSTRAN's systems submitted to the WMT 2017 shared news translation task for English-German, in both translation directions. Our systems are built using OpenNMT, an open-source neural machine translation system, implementing sequence-to-sequence models with LSTM encoder/decoders and attention. We experimented with monolingual data that was automatically back-translated. Our resulting models are further hyper-specialised with an adaptation technique that fine-tunes models according to the evaluation test sentences.
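The back-translation idea mentioned in the abstract can be sketched as a simple data-augmentation step: monolingual target-language sentences are translated into the source language by a reverse-direction model, and the resulting synthetic pairs are added to the genuine bitext. A minimal illustration follows; `translate_reverse` is a hypothetical stand-in for a target-to-source NMT model, not part of OpenNMT's API.

```python
# Hedged sketch of back-translation data augmentation.
# `translate_reverse` is a hypothetical reverse-direction translator,
# not an actual OpenNMT function.

def back_translate(monolingual_target, translate_reverse):
    """Pair each monolingual target sentence with a synthetic source
    produced by a reverse-direction model, yielding extra parallel data."""
    return [(translate_reverse(t), t) for t in monolingual_target]

def augment_corpus(parallel, monolingual_target, translate_reverse):
    # Genuine bitext plus synthetic pairs: the synthetic source side is
    # noisy, but the target side is clean human text, which is what the
    # decoder learns to produce.
    return parallel + back_translate(monolingual_target, translate_reverse)

if __name__ == "__main__":
    # Toy usage with a dummy reverse "model".
    bitext = [("ein Haus", "a house")]
    mono_en = ["a green tree"]
    dummy_reverse = lambda en: "<synthetic> " + en
    print(augment_corpus(bitext, mono_en, dummy_reverse))
```

In practice the synthetic and genuine data would be tokenized, shuffled, and fed to the seq2seq training pipeline together; the sketch only shows the corpus construction step.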

Authors (10)
  1. Yongchao Deng (3 papers)
  2. Jungi Kim (4 papers)
  3. Guillaume Klein (7 papers)
  4. Catherine Kobus (4 papers)
  5. Natalia Segal (2 papers)
  6. Christophe Servan (16 papers)
  7. Bo Wang (823 papers)
  8. Dakun Zhang (3 papers)
  9. Josep Crego (15 papers)
  10. Jean Senellart (17 papers)
Citations (5)