
Online Learning for Neural Machine Translation Post-editing (1706.03196v1)

Published 10 Jun 2017 in cs.LG and cs.CL

Abstract: Neural machine translation has revolutionized the field. Nevertheless, post-editing the system's outputs remains mandatory for tasks requiring high translation quality. Post-editing offers a unique opportunity to improve neural machine translation systems by applying online learning techniques that treat the post-edited translations as fresh training data. We review classical learning methods and propose a new optimization algorithm. We thoroughly compare online learning algorithms in a post-editing scenario. Results show significant improvements in translation quality and effort reduction.
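
The abstract describes an online learning loop in which each human post-edit is immediately fed back to the system as a new training sample. The sketch below illustrates that loop under stated assumptions: it uses a hypothetical toy model and plain SGD (one of the classical baselines the paper reviews), not the authors' actual NMT architecture or their proposed optimizer.

```python
# Minimal sketch of the online post-editing loop described in the abstract.
# The model, data, and decoding strategy are toy placeholders, not the
# authors' NMT system; only the update-per-post-edit pattern is the point.
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, DIM = 100, 100, 32

class ToyNMT(nn.Module):
    """Trivial stand-in for a real encoder-decoder NMT model."""
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, DIM)
        self.out = nn.Linear(DIM, TGT_VOCAB)

    def forward(self, src_ids):
        # Mean-pooled source representation -> target-vocabulary logits.
        ctx = self.src_emb(src_ids).mean(dim=0)   # (DIM,)
        return self.out(ctx)                      # (TGT_VOCAB,)

model = ToyNMT()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # classical OL baseline
loss_fn = nn.CrossEntropyLoss()

# Simulated post-editing session: for each source sentence, the system
# proposes a translation and the (simulated) post-edited reference then
# becomes a new, fresh training sample for one online update.
stream = [(torch.randint(0, SRC_VOCAB, (7,)),
           torch.randint(0, TGT_VOCAB, (5,))) for _ in range(3)]

for src, post_edit in stream:
    # 1) System hypothesis (toy decoding: top-k target tokens).
    with torch.no_grad():
        hypothesis = model(src).topk(len(post_edit)).indices

    # 2) Online update on the post-edited reference.
    logits = model(src).unsqueeze(0).expand(len(post_edit), -1)
    loss = loss_fn(logits, post_edit)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this setup the model adapts incrementally, one sentence at a time, rather than being retrained in batch; swapping the SGD step for a different update rule is where the paper's proposed optimization algorithm would plug in.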

Authors (3)
  1. Álvaro Peris (12 papers)
  2. Luis Cebrián (1 paper)
  3. Francisco Casacuberta (19 papers)
Citations (32)