Dialog Context Language Modeling with Recurrent Neural Networks (1701.04056v1)
Published 15 Jan 2017 in cs.CL
Abstract: In this work, we propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous works on contextual language modeling treat preceding utterances as a sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that specifically track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% on perplexity. The proposed models also demonstrate advantageous performance over other competitive contextual language models.
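To make the idea of conditioning a language model on dialog context concrete, here is a minimal PyTorch-style sketch. It is an illustrative simplification, not the paper's exact architecture: the class name `ContextRNNLM`, the LSTM choice, and all dimensions are assumptions. The key point it demonstrates is that the final hidden state from encoding the preceding utterance initializes the RNN that models the current utterance, so information flows across dialog turns.

```python
import torch
import torch.nn as nn

class ContextRNNLM(nn.Module):
    """Sketch of a context-conditioned RNN language model (hypothetical).

    The current utterance's language model is initialized from an encoding
    of the preceding utterance, letting dialog context influence word
    prediction. The paper's actual models differ in detail.
    """

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Encoder for the preceding utterance (the dialog context).
        self.context_encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Language model over the current utterance, seeded with context.
        self.lm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prev_utterance, cur_utterance):
        # Encode the previous turn; its final (h, c) state seeds the LM,
        # rather than treating both turns as one flat input sequence.
        _, (h, c) = self.context_encoder(self.embed(prev_utterance))
        outputs, _ = self.lm(self.embed(cur_utterance), (h, c))
        return self.out(outputs)  # per-position logits over the vocabulary

# Usage with toy data: batch of 2 dialogs, previous turn of 12 tokens,
# current turn of 8 tokens, vocabulary of 10,000 word types.
model = ContextRNNLM(vocab_size=10000)
prev = torch.randint(0, 10000, (2, 12))
cur = torch.randint(0, 10000, (2, 8))
logits = model(prev, cur)  # shape (2, 8, 10000)
```

A single-turn baseline would drop `context_encoder` and start the LM from a zero state; the contrast between the two is what the reported 3.3% perplexity improvement measures.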