Isochrony-Aware Neural Machine Translation for Automatic Dubbing (2112.08548v2)

Published 16 Dec 2021 in cs.CL

Abstract: We introduce the task of isochrony-aware machine translation, which aims at generating translations suitable for dubbing. Dubbing of a spoken sentence requires transferring the content as well as the speech-pause structure of the source into the target language to achieve audiovisual coherence. Practically, this implies correctly projecting pauses from the source to the target and ensuring that target speech segments have roughly the same duration as the corresponding source speech segments. In this work, we propose implicit and explicit modeling approaches to integrate isochrony information into neural machine translation. Experiments on English-German/French language pairs with automatic metrics show that the simplest of the considered approaches works best. Results are confirmed by human evaluations of translations and dubbed videos.
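
To make the isochrony constraint concrete, here is a minimal sketch (not the paper's implementation) of what an isochrony check could look like: the source utterance is split into pause-delimited segments with measured durations, and a candidate translation is accepted only if it preserves the pause structure and each target segment's estimated speaking time roughly matches the source segment's duration. The character-rate proxy for speech duration and the 20% tolerance are assumptions made here for illustration only.

```python
# Sketch of an isochrony check for dubbing-oriented MT (illustrative assumptions only).
from dataclasses import dataclass
from typing import List


@dataclass
class Segment:
    text: str
    duration: float  # seconds of speech in the source audio


def estimate_duration(text: str, chars_per_second: float = 15.0) -> float:
    """Very rough proxy for synthesized speech length (assumed rate)."""
    return len(text) / chars_per_second


def is_isochronous(source: List[Segment], target_texts: List[str],
                   tolerance: float = 0.2) -> bool:
    """True if the pause structure is preserved (same number of segments) and
    each target segment fits its source segment's duration within a tolerance."""
    if len(source) != len(target_texts):
        return False  # pause structure was not projected to the target
    for src, tgt in zip(source, target_texts):
        est = estimate_duration(tgt)
        if abs(est - src.duration) > tolerance * src.duration:
            return False
    return True


if __name__ == "__main__":
    source = [Segment("I'll be there tomorrow,", 1.3),
              Segment("right after the meeting.", 1.6)]
    target = ["Ich bin morgen da,", "direkt nach dem Meeting."]
    print(is_isochronous(source, target))  # True under the assumed rate/tolerance
```

The paper itself integrates this kind of timing information directly into the NMT model (implicitly or explicitly) rather than filtering outputs post hoc; the sketch only illustrates the constraint the generated translations should satisfy.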

Authors (5)
  1. Derek Tam (10 papers)
  2. Surafel M. Lakew (12 papers)
  3. Yogesh Virkar (9 papers)
  4. Prashant Mathur (21 papers)
  5. Marcello Federico (38 papers)
Citations (8)
