
Synthetic Source Language Augmentation for Colloquial Neural Machine Translation (2012.15178v1)

Published 30 Dec 2020 in cs.CL and cs.LG

Abstract: Neural machine translation (NMT) is typically domain- and style-dependent, and it requires large amounts of training data. State-of-the-art NMT models often fall short in handling colloquial variations of their source language, and the lack of parallel data in this regard is a challenging hurdle to systematically improving existing models. In this work, we develop a novel colloquial Indonesian-English test set collected from YouTube transcripts and Twitter. We perform synthetic style augmentation on the formal-Indonesian source side and show that it improves baseline Indonesian-English (Id-En) models, measured in BLEU, on the new test data.
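The synthetic style augmentation described above can be illustrated with a minimal sketch. This is not the authors' actual pipeline; it assumes a small, hypothetical formal-to-colloquial Indonesian lexicon and applies probabilistic word-level substitution to formal source sentences, producing colloquial-style variants that can be paired with the existing English targets.

```python
import random

# Hypothetical formal -> colloquial Indonesian lexicon (illustrative only;
# the paper's actual substitution rules are not reproduced here).
COLLOQUIAL_LEXICON = {
    "tidak": ["nggak", "gak"],   # "not"
    "saya": ["gue", "aku"],      # "I"
    "sudah": ["udah"],           # "already"
    "bagaimana": ["gimana"],     # "how"
}

def augment(sentence: str, p: float = 0.8, rng=random) -> str:
    """Replace formal words with colloquial variants with probability p."""
    out = []
    for token in sentence.split():
        variants = COLLOQUIAL_LEXICON.get(token.lower())
        if variants and rng.random() < p:
            out.append(rng.choice(variants))
        else:
            out.append(token)
    return " ".join(out)

# Example: generate a colloquial-style variant of a formal sentence.
print(augment("saya tidak tahu bagaimana caranya", p=1.0))
```

Training on the union of the original formal pairs and such augmented variants exposes the model to colloquial surface forms without requiring any new parallel data.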

Authors (4)
  1. Asrul Sani Ariesandy (1 paper)
  2. Mukhlis Amien (4 papers)
  3. Alham Fikri Aji (94 papers)
  4. Radityo Eko Prasojo (13 papers)
Citations (2)
