
Dialog Context Language Modeling with Recurrent Neural Networks

Published 15 Jan 2017 in cs.CL (arXiv:1701.04056v1)

Abstract: In this work, we propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous work on contextual language modeling treats preceding utterances as a sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that specifically track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% in perplexity. The proposed models also outperform other competitive contextual language models.
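The abstract's core idea, conditioning each step of an RNN language model on a dialog-context vector (e.g., an encoding of the other speaker's preceding turn), can be sketched as follows. This is an illustrative toy, not the paper's exact architecture: all dimensions, weight names, and the context vector here are assumptions for demonstration.

```python
import numpy as np

# Sketch of a context-conditioned RNN language model: a dialog-context
# vector c is concatenated to the word embedding at every time step,
# so the next-word distribution depends on the preceding turn as well
# as the current word history. Sizes below are arbitrary toy values.
rng = np.random.default_rng(0)
V, E, H, C = 20, 8, 16, 8          # vocab, embedding, hidden, context sizes

Wemb = rng.standard_normal((V, E)) * 0.1       # word embeddings
Wxh  = rng.standard_normal((E + C, H)) * 0.1   # (embedding + context) -> hidden
Whh  = rng.standard_normal((H, H)) * 0.1       # hidden -> hidden recurrence
Why  = rng.standard_normal((H, V)) * 0.1       # hidden -> vocabulary logits

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def step(h, word_id, c):
    """One RNN step: returns the next hidden state and next-word distribution."""
    x = np.concatenate([Wemb[word_id], c])  # inject dialog context at each step
    h = np.tanh(x @ Wxh + h @ Whh)
    return h, softmax(h @ Why)

def sequence_nll(words, c):
    """Negative log-likelihood of a word-id sequence given context vector c."""
    h, nll = np.zeros(H), 0.0
    for prev, nxt in zip(words[:-1], words[1:]):
        h, p = step(h, prev, c)
        nll -= np.log(p[nxt])
    return nll

# Stand-in for an encoded previous utterance by the other speaker.
context = rng.standard_normal(C)
print(sequence_nll([1, 4, 2, 7], context))
```

In the paper's setting the context vector would itself be produced by a recurrent encoder over the preceding utterance, with variants that track the two speakers' states separately; the sketch above only shows the conditioning mechanism.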

Citations (30)


Authors (2)
