
ON-TRAC Consortium for End-to-End and Simultaneous Speech Translation Challenge Tasks at IWSLT 2020 (2005.11861v1)

Published 24 May 2020 in cs.CL and eess.AS

Abstract: This paper describes the ON-TRAC Consortium translation systems developed for two challenge tracks featured in the Evaluation Campaign of IWSLT 2020, offline speech translation and simultaneous speech translation. ON-TRAC Consortium is composed of researchers from three French academic laboratories: LIA (Avignon Université), LIG (Université Grenoble Alpes), and LIUM (Le Mans Université). Attention-based encoder-decoder models, trained end-to-end, were used for our submissions to the offline speech translation track. Our contributions focused on data augmentation and ensembling of multiple models. In the simultaneous speech translation track, we build on Transformer-based wait-k models for the text-to-text subtask. For speech-to-text simultaneous translation, we attach a wait-k MT system to a hybrid ASR system. We propose an algorithm to control the latency of the ASR+MT cascade and achieve a good latency-quality trade-off on both subtasks.
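The abstract's simultaneous track builds on the wait-k policy: the decoder first reads k source tokens, then alternates one target write per source read until the input is exhausted. Below is a minimal sketch of that read/write schedule, not the authors' implementation; the function `translate_step` and the default `k=3` are hypothetical stand-ins for one step of a trained MT decoder.

```python
def wait_k_decode(source_stream, translate_step, k=3, eos="</s>"):
    """Illustrative wait-k decoding loop.

    READ k source tokens up front, then alternate one WRITE per READ.
    Once the source is exhausted, keep writing until EOS is produced.
    `translate_step(src_prefix, tgt_prefix)` is assumed to return the
    next target token given the visible source prefix.
    """
    src, tgt = [], []
    stream = iter(source_stream)
    source_done = False
    while True:
        # READ: keep consuming source until len(src) == len(tgt) + k,
        # which enforces the fixed k-token lag of the wait-k policy.
        while not source_done and len(src) < len(tgt) + k:
            try:
                src.append(next(stream))
            except StopIteration:
                source_done = True
        # WRITE: emit one target token conditioned on the visible prefix.
        token = translate_step(src, tgt)
        if token == eos:
            break
        tgt.append(token)
    return tgt
```

Smaller k lowers latency at the cost of translation quality; the paper's cascade additionally has to manage the latency introduced by the upstream ASR system, which this sketch does not model.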

Authors (8)
  1. Maha Elbayad (17 papers)
  2. Ha Nguyen (14 papers)
  3. Fethi Bougares (18 papers)
  4. Natalia Tomashenko (32 papers)
  5. Antoine Caubrière (9 papers)
  6. Benjamin Lecouteux (14 papers)
  7. Yannick Estève (45 papers)
  8. Laurent Besacier (76 papers)
Citations (11)
