Impact of Domain-Adapted Multilingual Neural Machine Translation in the Medical Domain (2212.02143v1)

Published 5 Dec 2022 in cs.CL

Abstract: Multilingual Neural Machine Translation (MNMT) models leverage many language pairs during training to improve translation quality for low-resource languages by transferring knowledge from high-resource languages. We study the quality of a domain-adapted MNMT model in the medical domain for English-Romanian, using automatic metrics and a human error typology annotation that includes terminology-specific error categories. We compare the out-of-domain MNMT model with the in-domain adapted one. The in-domain model outperforms the out-of-domain model on all measured automatic metrics and produces fewer terminology errors.
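The "automatic metrics" the study uses to compare the two systems are corpus-level MT metrics such as BLEU. As a minimal, self-contained sketch (not the paper's actual evaluation pipeline, which would typically use tooling like sacreBLEU), a corpus BLEU with uniform n-gram weights and a brevity penalty can be computed as follows:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU (0-100) with uniform weights over 1..max_n-grams.

    `hypotheses` and `references` are parallel lists of whitespace-tokenized
    sentences (one reference per hypothesis, for simplicity).
    """
    matches = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # candidate n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            # Clipped matches: intersection of hypothesis and reference counts
            matches[n - 1] += sum((ngrams(h, n) & ngrams(r, n)).values())
            totals[n - 1] += max(len(h) - n + 1, 0)
    if min(matches) == 0:
        return 0.0  # any zero precision drives the geometric mean to zero
    log_precision = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    # Brevity penalty discourages overly short hypotheses
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_precision)
```

A perfect hypothesis scores 100, and any divergence from the reference lowers the score; comparing such scores between the out-of-domain and in-domain systems is the kind of measurement the abstract refers to.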

Authors (4)
  1. Miguel Rios (6 papers)
  2. Raluca-Maria Chereji (1 paper)
  3. Alina Secara (1 paper)
  4. Dragos Ciobanu (1 paper)