
LLMs Are Zero-Shot Context-Aware Simultaneous Translators (2406.13476v3)

Published 19 Jun 2024 in cs.CL

Abstract: The advent of transformers has fueled progress in machine translation. More recently, LLMs have come into the spotlight thanks to their generality and strong performance across a wide range of language tasks, including translation. Here we show that open-source LLMs perform on par with or better than some state-of-the-art baselines on simultaneous machine translation (SiMT) tasks, zero-shot. We also demonstrate that injecting minimal background information, which is easy with an LLM, brings further performance gains, especially on challenging technical subject matter. This highlights LLMs' potential for building the next generation of massively multilingual, context-aware, and terminologically accurate SiMT systems that require no resource-intensive training or fine-tuning.
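
The abstract describes zero-shot prompting with optional injection of background context, but does not reproduce the paper's prompt template. The sketch below is only an illustration of that general idea, assuming a re-prompting loop over incrementally arriving source chunks; `llm_generate` is a hypothetical stand-in for any LLM completion call, and the prompt wording, chunking policy, and context fields are assumptions, not the authors' protocol.

```python
# Hedged sketch: incremental (simultaneous) translation by re-prompting an LLM
# as source chunks arrive, with optional background context prepended.
# `llm_generate` is a hypothetical placeholder for a real completion API;
# the prompt wording and chunking policy are assumptions, not the paper's.

from typing import Iterable, Iterator


def llm_generate(prompt: str) -> str:
    """Placeholder: swap in a real chat/completions call here."""
    raise NotImplementedError


def simultaneous_translate(chunks: Iterable[str],
                           src_lang: str = "English",
                           tgt_lang: str = "Japanese",
                           background: str = "") -> Iterator[str]:
    """Yield a growing translation as each new source chunk arrives."""
    # Injecting minimal background info (topic, terminology) is what the
    # abstract credits with gains on technical subject matter.
    context = f"Background (topic, terminology):\n{background}\n\n" if background else ""
    seen, translated = "", ""
    for chunk in chunks:
        seen += chunk  # source text observed so far
        prompt = (
            f"{context}"
            f"Translate the following partial {src_lang} speech into {tgt_lang}. "
            f"Only extend the existing translation; do not revise it.\n\n"
            f"Source so far: {seen}\n"
            f"Translation so far: {translated}\n"
            f"Continuation:"
        )
        translated += llm_generate(prompt)
        yield translated
```

The "extend, do not revise" instruction reflects a common SiMT constraint (emitted target text cannot be retracted); whether the paper enforces it via prompting or decoding is not stated here.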

Authors (3)
  1. Roman Koshkin (3 papers)
  2. Katsuhito Sudoh (35 papers)
  3. Satoshi Nakamura (94 papers)