
Towards Robust Neural Machine Translation (1805.06130v1)

Published 16 May 2018 in cs.CL

Abstract: Small perturbations in the input can severely distort intermediate representations and thus impact translation quality of neural machine translation (NMT) models. In this paper, we propose to improve the robustness of NMT models with adversarial stability training. The basic idea is to make both the encoder and decoder in NMT models robust against input perturbations by enabling them to behave similarly for the original input and its perturbed counterpart. Experimental results on Chinese-English, English-German and English-French translation tasks show that our approaches can not only achieve significant improvements over strong NMT systems but also improve the robustness of NMT models.
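
The abstract condenses the method into one idea: train the encoder and decoder so that a clean input and its perturbed counterpart yield similar behavior, using an adversarial objective. As a rough illustration only, here is a minimal PyTorch sketch of a stability loss in this spirit. The interfaces (`model.encode`, `model.decode_loss`) and the pooled-state `Discriminator` are illustrative assumptions, not the paper's released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Discriminator(nn.Module):
    """Small MLP that classifies a pooled encoder state as clean vs. perturbed.
    (Hypothetical component for illustration; the paper's exact architecture
    may differ.)"""
    def __init__(self, hidden_size):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, 1),
        )

    def forward(self, h):
        return self.net(h)

def stability_loss(model, discriminator, x, x_perturbed, y, lambda_adv=1.0):
    """Translation loss on both the clean and perturbed source, plus an
    adversarial term that pushes their encoder states to be indistinguishable.
    `model.encode` and `model.decode_loss` are assumed interfaces."""
    h_clean = model.encode(x)            # encoder states for the original input
    h_noisy = model.encode(x_perturbed)  # encoder states for the perturbed input

    # Standard NLL translation loss computed from both views of the source,
    # so the decoder also learns to cope with perturbed representations.
    loss_clean = model.decode_loss(h_clean, y)
    loss_noisy = model.decode_loss(h_noisy, y)

    # Adversarial term: the encoder is trained to make perturbed encodings
    # "look clean" to the discriminator (generator side of the min-max game).
    d_noisy = discriminator(h_noisy.mean(dim=1))
    adv_loss = F.binary_cross_entropy_with_logits(
        d_noisy, torch.ones_like(d_noisy))

    return loss_clean + loss_noisy + lambda_adv * adv_loss
```

In a full training loop this would alternate with a discriminator update (real labels for clean encodings, fake for perturbed ones), giving the usual adversarial min-max dynamic; the sketch above shows only the encoder/decoder side of that objective.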

Authors (5)
  1. Yong Cheng (58 papers)
  2. Zhaopeng Tu (135 papers)
  3. Fandong Meng (174 papers)
  4. Junjie Zhai (7 papers)
  5. Yang Liu (2253 papers)
Citations (157)
