
FusionFormer: Fusing Operations in Transformer for Efficient Streaming Speech Recognition (2210.17079v1)

Published 31 Oct 2022 in cs.SD, cs.CL, and eess.AS

Abstract: The recently proposed Conformer architecture, which combines convolution with attention to capture both local and global dependencies, has become the de facto backbone model for Automatic Speech Recognition (ASR). Inherited from NLP tasks, the architecture adopts Layer Normalization (LN) as its default normalization technique. However, through a series of systematic studies, we find that LN can account for 10% of the inference time even though it contributes only 0.1% of the FLOPs. This motivates us to replace LN with other normalization techniques, e.g., Batch Normalization (BN), to speed up inference through operator fusion and by avoiding the computation of mean and variance statistics at inference time. After examining several straightforward attempts that directly remove all LN layers or replace them with BN in place, we find that the resulting divergence is mainly caused by unstable layer outputs. We therefore propose to append a BN layer to each linear or convolution layer, which yields stable training. We also propose to simplify the activations in Conformer, such as Swish and GLU, by replacing them with ReLU. All of these exchanged modules can be fused into the weights of the adjacent linear/convolution layers and hence incur zero inference cost; we therefore name the model FusionFormer. Our experiments indicate that FusionFormer is as effective as the LN-based Conformer and is about 10% faster.
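The core trick the abstract describes is folding an inference-time BatchNorm into the adjacent linear/convolution weights so the normalization becomes free. Below is a minimal PyTorch sketch of that folding for a Linear followed by BatchNorm1d; the function name and shapes are illustrative assumptions, not taken from the paper's released code.

```python
import torch
import torch.nn as nn

def fuse_linear_bn(linear: nn.Linear, bn: nn.BatchNorm1d) -> nn.Linear:
    """Fold a BatchNorm1d that follows a Linear layer into the Linear weights.

    Uses the BN running statistics (eval mode), so the BN layer can be dropped
    afterwards and adds zero inference cost.
    """
    # Per-output-feature scale: gamma / sqrt(running_var + eps)
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    fused = nn.Linear(linear.in_features, linear.out_features, bias=True)
    with torch.no_grad():
        # Scale each output row of the weight matrix
        fused.weight.copy_(linear.weight * scale.unsqueeze(1))
        bias = linear.bias if linear.bias is not None else torch.zeros(linear.out_features)
        # Shift by the BN mean, then apply gamma/sigma and add beta
        fused.bias.copy_((bias - bn.running_mean) * scale + bn.bias)
    return fused

# Usage: the fused layer matches linear -> bn in eval mode.
linear, bn = nn.Linear(256, 256), nn.BatchNorm1d(256).eval()
x = torch.randn(8, 256)
fused = fuse_linear_bn(linear, bn)
assert torch.allclose(fused(x), bn(linear(x)), atol=1e-5)
```

The same algebra applies to convolution layers (scaling each output channel of the kernel), which is why appending BN after every linear/convolution layer, as the paper proposes, keeps inference cost unchanged.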

Authors (11)
  1. Xingchen Song (18 papers)
  2. Di Wu (478 papers)
  3. Binbin Zhang (47 papers)
  4. Zhiyong Wu (171 papers)
  5. Wenpeng Li (7 papers)
  6. Dongfang Li (46 papers)
  7. Pengshen Zhang (2 papers)
  8. Zhendong Peng (20 papers)
  9. Fuping Pan (11 papers)
  10. Changbao Zhu (6 papers)
  11. Zhongqin Wu (25 papers)
Citations (1)
