Non-Autoregressive Transformer ASR with CTC-Enhanced Decoder Input (2010.15025v2)

Published 28 Oct 2020 in cs.SD, cs.CL, and eess.AS

Abstract: Non-autoregressive (NAR) transformer models have achieved significant inference speedup, but at the cost of inferior accuracy compared to autoregressive (AR) models in automatic speech recognition (ASR). Most NAR transformers take a fixed-length sequence filled with MASK tokens, or a redundant sequence copied from encoder states, as decoder input; such inputs cannot provide efficient target-side information and thus lead to accuracy degradation. To address this problem, we propose a CTC-enhanced NAR transformer, which generates the target sequence by refining the predictions of a CTC module. Experimental results show that our method outperforms all previous NAR counterparts and achieves 50x faster decoding than a strong AR baseline with only 0.0 ~ 0.3 absolute CER degradation on the Aishell-1 and Aishell-2 datasets.
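The core idea, using CTC predictions rather than MASK tokens as the decoder input, can be illustrated with standard CTC greedy decoding. The sketch below is not the authors' code; the blank id and the frame-level ids are illustrative assumptions. It shows how frame-level CTC outputs are collapsed into a token sequence that a NAR decoder could then refine in a single pass.

```python
# Minimal sketch (not the authors' implementation) of producing a
# CTC-enhanced decoder input: greedy-decode the CTC module's frame-level
# predictions, then feed the collapsed sequence to the NAR decoder.

BLANK = 0  # assumed CTC blank token id


def ctc_greedy_collapse(frame_ids):
    """Standard CTC greedy decoding: merge repeated frame predictions
    and drop blanks, yielding a draft target sequence."""
    out = []
    prev = None
    for t in frame_ids:
        if t != prev and t != BLANK:
            out.append(t)
        prev = t
    return out


# Hypothetical frame-level argmax ids from a CTC head over 10 frames:
frames = [0, 7, 7, 0, 3, 3, 3, 0, 0, 5]
decoder_input = ctc_greedy_collapse(frames)
print(decoder_input)  # [7, 3, 5]
```

Because the draft already carries target-side information (token identities and length), the decoder only needs to correct CTC's errors rather than generate from scratch, which is the source of the reported speed/accuracy trade-off.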

Authors (6)
  1. Xingchen Song (18 papers)
  2. Zhiyong Wu (171 papers)
  3. Yiheng Huang (12 papers)
  4. Chao Weng (61 papers)
  5. Dan Su (101 papers)
  6. Helen Meng (204 papers)
Citations (34)