
On the Effectiveness of Neural Text Generation based Data Augmentation for Recognition of Morphologically Rich Speech (2006.05129v1)

Published 9 Jun 2020 in eess.AS, cs.CL, and cs.SD

Abstract: Advanced neural network models have penetrated Automatic Speech Recognition (ASR) in recent years; however, in language modeling many systems still rely partly or entirely on traditional Back-off N-gram Language Models (BNLM). The reason for this is the high cost and complexity of training and using neural language models, which are mostly applicable only by adding a second decoding pass (rescoring). In our recent work we significantly improved the online performance of a conversational speech transcription system by transferring knowledge from a Recurrent Neural Network Language Model (RNNLM) to the single-pass BNLM with text generation based data augmentation. In the present paper we analyze the amount of transferable knowledge and demonstrate that the neural augmented LM (RNN-BNLM) can capture almost 50% of the knowledge of the RNNLM while dropping the second decoding pass and making the system real-time capable. We also systematically compare word and subword LMs and show that subword-based neural text augmentation can be especially beneficial under under-resourced conditions. In addition, we show that by using the RNN-BNLM in the first pass followed by a neural second pass, offline ASR results can be further significantly improved.
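The augmentation idea described in the abstract — sampling text from a trained RNNLM and adding it to the n-gram training corpus — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `sample_sentence` is a hypothetical stand-in for the trained RNNLM sampler, and the bigram counting stands in for full BNLM estimation (which would normally use a toolkit such as SRILM).

```python
import random
from collections import Counter

def sample_sentence(vocab, max_len=8, rng=None):
    # Hypothetical stand-in for sampling from the trained RNNLM;
    # in the paper's setup this would be a recurrent network trained
    # on the in-domain conversational transcripts.
    rng = rng or random.Random()
    n = rng.randint(3, max_len)
    return [rng.choice(vocab) for _ in range(n)]

def augment_corpus(corpus, vocab, n_generated, seed=0):
    """Append n_generated RNNLM-sampled sentences to the original corpus."""
    rng = random.Random(seed)
    generated = [sample_sentence(vocab, rng=rng) for _ in range(n_generated)]
    return corpus + generated

def bigram_counts(corpus):
    """Collect bigram counts (with sentence-boundary markers);
    a real BNLM would be estimated from such counts with back-off smoothing."""
    counts = Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        counts.update(zip(tokens, tokens[1:]))
    return counts

corpus = [["hello", "world"], ["good", "morning"]]
vocab = ["hello", "world", "good", "morning"]
augmented = augment_corpus(corpus, vocab, n_generated=100)
counts = bigram_counts(augmented)
```

The same pipeline applies to subword units: tokenize both the original and generated text with the same subword segmentation before counting, which is what makes the approach useful in under-resourced conditions where word-level coverage is sparse.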

Authors (4)
  1. Balázs Tarján (3 papers)
  2. György Szaszák (4 papers)
  3. Tibor Fegyó (3 papers)
  4. Péter Mihajlik (3 papers)
Citations (2)