
LSTM Language Models for LVCSR in First-Pass Decoding and Lattice-Rescoring (1907.01030v1)

Published 1 Jul 2019 in eess.AS, cs.LG, cs.SD, and stat.ML

Abstract: LSTM-based language models (LMs) are an important part of modern LVCSR systems, as they significantly improve performance over traditional backoff LMs. Incorporating them efficiently into decoding has been notoriously difficult. In this paper we present an approach based on a combination of one-pass decoding and lattice rescoring. We perform decoding with the LSTM LM in the first pass but recombine hypotheses that share the last two words; afterwards we rescore the resulting lattice. We run our systems on GPGPU-equipped machines and are able to produce competitive results on the Hub5'00 and Librispeech evaluation corpora with a runtime better than real time. In addition, we briefly investigate the possibility of carrying out the full sum over all state sequences belonging to a given word hypothesis during decoding, without recombination.
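
The recombination step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Hypothesis` class, its fields, and the `recombine` helper are hypothetical names chosen for clarity. The idea is that exact recombination is impossible with an LSTM LM (every hypothesis carries a distinct recurrent state), so hypotheses are instead merged whenever their last two words match, keeping the best-scoring one per key.

```python
# Hedged sketch of last-two-words hypothesis recombination during first-pass
# decoding with an LSTM LM. All names here are illustrative assumptions.
from dataclasses import dataclass
from typing import Any, Tuple, List, Dict

@dataclass
class Hypothesis:
    words: Tuple[str, ...]   # word sequence decoded so far
    score: float             # accumulated log-probability (higher is better)
    lm_state: Any = None     # recurrent LSTM state carried along the beam

def recombine(hypotheses: List[Hypothesis]) -> List[Hypothesis]:
    """Keep only the best-scoring hypothesis per (last two words) key.

    Truncating the recombination key to a bigram word context approximates
    exact recombination and keeps the number of active hypotheses (and the
    resulting lattice) compact, at the cost of discarding LSTM states whose
    full histories differ further back.
    """
    best: Dict[Tuple[str, ...], Hypothesis] = {}
    for hyp in hypotheses:
        key = hyp.words[-2:]  # last two words define the equivalence class
        if key not in best or hyp.score > best[key].score:
            best[key] = hyp
    return list(best.values())
```

In a beam-search decoder, a pass like this would run after each word-end expansion; the surviving hypotheses' word boundaries and scores then form the lattice that is rescored in the second pass.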

Authors (4)
  1. Eugen Beck (9 papers)
  2. Wei Zhou (311 papers)
  3. Ralf Schlüter (73 papers)
  4. Hermann Ney (104 papers)
Citations (34)
