Dialog Context Language Modeling with Recurrent Neural Networks
Abstract: In this work, we propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous work on contextual language modeling treats preceding utterances as a sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that specifically track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% in perplexity. The proposed models also demonstrate advantageous performance over other competitive contextual language models.
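To make the core idea concrete, below is a minimal sketch of conditioning an RNN language model on dialog context: a summary vector of the preceding turns (for example, the final hidden state of an encoder run over the previous utterance) is concatenated with each word embedding of the current turn before the recurrent layer. The class name `ContextualRNNLM`, all dimensions, and the concatenation scheme are illustrative assumptions for this sketch, not the paper's exact architecture, which additionally models speaker interactions.

```python
import torch
import torch.nn as nn

class ContextualRNNLM(nn.Module):
    """Sketch of an RNN language model conditioned on a dialog-context
    vector (hypothetical architecture, not the paper's exact model)."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, ctx_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # The context vector (e.g., the final hidden state of an encoder
        # over the previous utterance) is concatenated with each word
        # embedding of the current utterance.
        self.rnn = nn.LSTM(embed_dim + ctx_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, context):
        # tokens:  (batch, seq_len) word ids of the current utterance
        # context: (batch, ctx_dim) summary of the preceding dialog turns
        emb = self.embed(tokens)                                # (B, T, E)
        ctx = context.unsqueeze(1).expand(-1, emb.size(1), -1)  # (B, T, C)
        hidden, _ = self.rnn(torch.cat([emb, ctx], dim=-1))     # (B, T, H)
        return self.out(hidden)                                 # (B, T, V) logits

# Usage: condition the current turn on an encoding of the previous turn.
vocab_size = 10000
model = ContextualRNNLM(vocab_size)
prev_turn_encoding = torch.zeros(2, 256)          # placeholder context vectors
current_turn = torch.randint(0, vocab_size, (2, 12))
logits = model(current_turn, prev_turn_encoding)  # next-word scores per position
```

A common design alternative is to use the context vector to initialize the RNN hidden state rather than concatenating it at every step; feeding it at each step, as above, keeps the context signal available throughout long utterances.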