
CTC-based Non-autoregressive Speech Translation (2305.17358v1)

Published 27 May 2023 in cs.CL

Abstract: Combining end-to-end speech translation (ST) and non-autoregressive (NAR) generation is promising in language and speech processing for their advantages of less error propagation and low latency. In this paper, we investigate the potential of connectionist temporal classification (CTC) for non-autoregressive speech translation (NAST). In particular, we develop a model consisting of two encoders that are guided by CTC to predict the source and target texts, respectively. Introducing CTC into NAST on both language sides poses obvious challenges: 1) the conditionally independent generation somewhat breaks the interdependency among tokens, and 2) the monotonic alignment assumption in standard CTC does not hold in translation tasks. In response, we develop a prediction-aware encoding approach and a cross-layer attention approach to address these issues. We also use curriculum learning to improve training convergence. Experiments on the MuST-C ST benchmarks show that our NAST model achieves an average BLEU score of 29.5 with a speed-up of 5.67$\times$, which is comparable to the autoregressive counterpart and even outperforms the previous best result by 0.9 BLEU points.
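To make the two challenges the abstract names concrete, the sketch below implements the standard CTC forward algorithm (the machinery the paper builds on, not the authors' NAST model) in pure Python. It marginalizes the probability of a label sequence over all monotonic alignments, under per-frame conditionally independent predictions; the function name, blank index, and toy vocabulary are our own illustrative choices.

```python
# Standard CTC forward algorithm, a minimal sketch.
# Computes P(labels | x) by summing over all monotonic alignments,
# assuming each frame's prediction is conditionally independent --
# exactly the two assumptions the paper identifies as problematic
# for translation (broken token interdependency, monotonicity).

def ctc_prob(probs, labels, blank=0):
    """probs: per-frame probability distributions over the vocabulary
              (list of lists, one entry per symbol, blank included).
       labels: target symbol indices (no blanks inserted by caller)."""
    # Extend the label sequence with blanks: [b, y1, b, y2, ..., b]
    ext = [blank]
    for y in labels:
        ext += [y, blank]
    S, T = len(ext), len(probs)

    # alpha[s] = total probability of all alignment prefixes that end
    # at extended position s after the current frame.
    alpha = [0.0] * S
    alpha[0] = probs[0][ext[0]]
    if S > 1:
        alpha[1] = probs[0][ext[1]]

    for t in range(1, T):
        new = [0.0] * S
        for s in range(S):
            total = alpha[s]                       # repeat the symbol
            if s >= 1:
                total += alpha[s - 1]              # advance one position
            if s >= 2 and ext[s] != blank and ext[s] != ext[s - 2]:
                total += alpha[s - 2]              # skip over a blank
            new[s] = total * probs[t][ext[s]]
        alpha = new

    # Valid alignments end on the final symbol or the trailing blank.
    return alpha[S - 1] + (alpha[S - 2] if S > 1 else 0.0)
```

For example, with two frames of uniform distributions over a vocabulary {blank, a}, the three alignments (a, blank), (blank, a), and (a, a) all collapse to the label "a", so `ctc_prob` sums their probabilities. Note that "aa" gets probability zero with only two frames, since CTC needs a separating blank between repeated symbols.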

Authors (12)
  1. Chen Xu (186 papers)
  2. Xiaoqian Liu (24 papers)
  3. Xiaowen Liu (12 papers)
  4. Qingxuan Sun (1 paper)
  5. Yuhao Zhang (107 papers)
  6. Murun Yang (5 papers)
  7. Qianqian Dong (19 papers)
  8. Tom Ko (31 papers)
  9. Mingxuan Wang (83 papers)
  10. Tong Xiao (119 papers)
  11. Anxiang Ma (4 papers)
  12. Jingbo Zhu (79 papers)
Citations (9)