
Korean-English Machine Translation with Multiple Tokenization Strategy (2105.14274v3)

Published 29 May 2021 in cs.CL

Abstract: This work investigates how tokenization methods affect the training results of machine translation models. Alphabet tokenization, morpheme tokenization, and BPE tokenization were applied to Korean as the source language and to English as the target language, and a comparison experiment was conducted by training each of the resulting nine models for 50,000 epochs using the Transformer neural network. Measured by BLEU score, the model that applied BPE tokenization to Korean and morpheme tokenization to English performed best, scoring 35.73.
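The BPE tokenization compared in the abstract learns a vocabulary by repeatedly merging the most frequent adjacent symbol pair. A minimal sketch of that merge-learning loop is below (an illustrative implementation, not the authors' code; the function names and toy corpus are hypothetical):

```python
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs across all words, weighted by word frequency."""
    pairs = Counter()
    for symbols, freq in vocab.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(vocab, pair):
    """Replace every occurrence of `pair` with a single merged symbol."""
    merged = pair[0] + pair[1]
    new_vocab = {}
    for symbols, freq in vocab.items():
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
                out.append(merged)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        new_vocab[tuple(out)] = freq
    return new_vocab

def learn_bpe(corpus, num_merges):
    """Learn BPE merge rules from a whitespace-tokenized corpus."""
    word_freqs = Counter(corpus.split())
    # Start from character-level symbols (the "alphabet tokenization" baseline).
    vocab = {tuple(word): freq for word, freq in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(vocab)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        vocab = merge_pair(vocab, best)
        merges.append(best)
    return merges, vocab
```

For a toy corpus like `"low low low lower lowest"`, the first two merges are `('l','o')` and `('lo','w')`, so frequent words collapse to single subword units while rare words remain split. The same procedure applies to Korean text, where the starting symbols would typically be Hangul characters or jamo.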

Authors (3)
  1. Dojun Park
  2. Youngjin Jang
  3. Harksoo Kim
