LoRA-Whisper: Parameter-Efficient and Extensible Multilingual ASR (2406.06619v1)

Published 7 Jun 2024 in eess.AS, cs.AI, and cs.CL

Abstract: Recent years have witnessed significant progress in multilingual automatic speech recognition (ASR), driven by the emergence of end-to-end (E2E) models and the scaling of multilingual datasets. Despite these advances, two main challenges persist in multilingual ASR: language interference and the incorporation of new languages without degrading the performance of existing ones. This paper proposes LoRA-Whisper, which incorporates LoRA matrices into Whisper for multilingual ASR, effectively mitigating language interference. Furthermore, by leveraging LoRA and the similarities between languages, we can achieve better performance on new languages while upholding consistent performance on the original ones. Experiments on a real-world task across eight languages demonstrate that our proposed LoRA-Whisper yields relative gains of 18.5% and 23.0% over the baseline system for multilingual ASR and language expansion, respectively.
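
The core idea described in the abstract is to attach low-rank adaptation (LoRA) matrices to a frozen Whisper backbone so that each language gets its own lightweight adapter. The sketch below is not the authors' implementation; it assumes Hugging Face `transformers` and `peft`, and the checkpoint name, target modules, rank, and per-language adapter names are illustrative choices.

```python
# Minimal sketch: per-language LoRA adapters on a pretrained Whisper model,
# using Hugging Face `transformers` + `peft`. Hyperparameters are assumptions.
from transformers import WhisperForConditionalGeneration
from peft import LoraConfig, get_peft_model

# Load a frozen Whisper backbone (checkpoint chosen for illustration).
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")

# LoRA applied to the attention projections; rank/alpha are assumed values.
lora_cfg = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"])

model = get_peft_model(base, lora_cfg)  # wraps the base model, freezes its weights
model.print_trainable_parameters()      # only the LoRA matrices are trainable

# One LoRA module per language, switched at inference time (illustrative names).
for lang in ["de", "fr", "es"]:
    model.add_adapter(lang, lora_cfg)
model.set_adapter("de")  # activate the German adapter for decoding German audio
```

Switching adapters selects language-specific LoRA weights while the shared Whisper parameters stay fixed, which mirrors the paper's goal of adding new languages without degrading performance on the original ones.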

Authors (6)
  1. Zheshu Song (4 papers)
  2. Jianheng Zhuo (7 papers)
  3. Yifan Yang (578 papers)
  4. Ziyang Ma (73 papers)
  5. Shixiong Zhang (11 papers)
  6. Xie Chen (166 papers)
Citations (6)
