Sentence Concatenation Approach to Data Augmentation for Neural Machine Translation (2104.08478v1)

Published 17 Apr 2021 in cs.CL

Abstract: Neural machine translation (NMT) has recently gained widespread attention because of its high translation accuracy. However, it performs poorly on long sentences, which is a major issue in low-resource languages. This issue is assumed to be caused by an insufficient number of long sentences in the training data. Therefore, this study proposes a simple data augmentation method to handle long sentences. The method uses only the given parallel corpora as training data and generates long sentences by concatenating two sentences. The experimental results confirm that the proposed data augmentation method improves long sentence translation despite its simplicity, and translation quality improves further when the method is combined with back-translation.
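
The abstract describes the augmentation only at a high level: synthetic long training examples are built by concatenating two sentence pairs from the existing parallel corpus. The sketch below is a minimal illustration of that idea, assuming random pairing, a plain whitespace join, and the hypothetical helper name `concat_augment`; the paper's actual pairing and joining details may differ.

```python
import random

def concat_augment(src_sents, tgt_sents, num_samples, seed=0):
    """Build synthetic long (source, target) pairs by concatenating two
    randomly chosen sentence pairs from a parallel corpus.

    Assumption: pairing is uniform-random and sentences are joined with a
    single space; these are illustrative choices, not details from the paper.
    """
    assert len(src_sents) == len(tgt_sents)
    rng = random.Random(seed)
    augmented = []
    for _ in range(num_samples):
        i = rng.randrange(len(src_sents))
        j = rng.randrange(len(src_sents))
        # Use the same two indices on both sides so the synthetic
        # example remains a parallel sentence pair.
        augmented.append((src_sents[i] + " " + src_sents[j],
                          tgt_sents[i] + " " + tgt_sents[j]))
    return augmented

# Usage: the synthetic pairs are appended to the original corpus
# before training the NMT model.
src = ["he runs .", "she reads a book ."]
tgt = ["er läuft .", "sie liest ein buch ."]
train_pairs = list(zip(src, tgt)) + concat_augment(src, tgt, num_samples=2)
```

Because the method draws only on the given parallel corpora, it can also be applied on top of back-translated data, which is the combination the abstract reports as yielding further gains.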

Authors (4)
  1. Seiichiro Kondo (1 paper)
  2. Kengo Hotate (2 papers)
  3. Masahiro Kaneko (46 papers)
  4. Mamoru Komachi (40 papers)
Citations (15)