Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding (2009.01003v1)

Published 23 Aug 2020 in cs.CL, cs.SD, and eess.AS

Abstract: This paper proposes to generalize the variational recurrent neural network (RNN) with variational inference (VI)-based dropout regularization, originally employed for long short-term memory (LSTM) cells, to more advanced RNN architectures such as the gated recurrent unit (GRU) and bi-directional LSTM/GRU. The new variational RNNs are applied to slot filling, an intriguing but challenging task in spoken language understanding. Experiments on the ATIS dataset suggest that variational RNNs with VI-based dropout regularization significantly outperform baseline RNN systems using naive dropout regularization in terms of F-measure. In particular, the variational bi-directional LSTM/GRU obtains the best F-measure score.
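The defining property of VI-based (variational) dropout, as opposed to the naive dropout the abstract contrasts it with, is that the dropout masks are sampled once per sequence and reused at every timestep, rather than resampled at each step. Below is a minimal sketch of that idea in PyTorch; the framework choice, the class name `VariationalDropoutLSTM`, and the dropout rate are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class VariationalDropoutLSTM(nn.Module):
    """Sketch of VI-based (variational) dropout for an LSTM.

    One dropout mask per sequence is shared across all timesteps,
    applied to both the input and the recurrent hidden state.
    Naive dropout would instead redraw the masks at every step.
    """

    def __init__(self, input_size, hidden_size, dropout=0.25):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.hidden_size = hidden_size
        self.dropout = dropout

    def forward(self, x):  # x: (batch, time, input_size)
        batch, time, input_size = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = x.new_zeros(batch, self.hidden_size)

        # Sample masks ONCE per sequence (the variational scheme);
        # inverted-dropout scaling keeps activations unbiased.
        if self.training and self.dropout > 0:
            keep = 1.0 - self.dropout
            in_mask = x.new_empty(batch, input_size).bernoulli_(keep) / keep
            h_mask = x.new_empty(batch, self.hidden_size).bernoulli_(keep) / keep
        else:
            in_mask = h_mask = None

        outputs = []
        for t in range(time):
            x_t = x[:, t]
            if in_mask is not None:
                x_t = x_t * in_mask   # same input mask every step
                h = h * h_mask        # same recurrent mask every step
            h, c = self.cell(x_t, (h, c))
            outputs.append(h)
        return torch.stack(outputs, dim=1)  # (batch, time, hidden)
```

For slot filling, the per-timestep outputs would typically feed a token-level classifier over slot labels; the paper's generalization to GRU and bi-directional LSTM/GRU applies the same shared-mask principle to those cells and to both directions.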

Authors (3)
  1. Jun Qi (28 papers)
  2. Xu Liu (213 papers)
  3. Javier Tejedor (7 papers)
Citations (2)
