Dialog Context Language Modeling with Recurrent Neural Networks (1701.04056v1)

Published 15 Jan 2017 in cs.CL

Abstract: In this work, we propose contextual language models that incorporate dialog-level discourse information into language modeling. Previous work on contextual language modeling treats preceding utterances as a flat sequence of inputs, without considering dialog interactions. We design recurrent neural network (RNN) based contextual language models that specifically track the interactions between speakers in a dialog. Experimental results on the Switchboard Dialog Act Corpus show that the proposed model outperforms a conventional single-turn RNN language model by 3.3% in perplexity. The proposed models also demonstrate advantageous performance over other competitive contextual language models.
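
To make the idea concrete, the sketch below shows one common way to condition an RNN language model on dialog context: encode the preceding utterance with a context RNN and use its final state to initialize the language model over the current turn. This is a minimal, hypothetical illustration of the general technique, not the paper's exact architecture; the class name, layer sizes, and the single-utterance context window are all assumptions.

```python
# Hypothetical sketch: a dialog-context RNN language model. The context RNN
# summarizes the previous utterance; its final state conditions the LM RNN
# over the current utterance. Not the paper's exact model.
import torch
import torch.nn as nn

class ContextualRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.context_rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.lm_rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prev_utterance, current_utterance):
        # Encode the preceding utterance into a dialog-context state.
        _, (h_ctx, c_ctx) = self.context_rnn(self.embed(prev_utterance))
        # Condition the current turn's language model on that state.
        out, _ = self.lm_rnn(self.embed(current_utterance), (h_ctx, c_ctx))
        return self.proj(out)  # next-word logits at each position

# Usage with illustrative token ids.
vocab_size = 10000
model = ContextualRNNLM(vocab_size)
prev = torch.randint(0, vocab_size, (1, 12))  # previous turn
cur = torch.randint(0, vocab_size, (1, 9))    # current turn
logits = model(prev, cur)                     # shape (1, 9, vocab_size)
```

The paper's speaker-aware variants go further by tracking each speaker's role in the interaction rather than treating all context as one stream; the sketch above only captures the shared-context baseline idea.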

Authors (2)
  1. Bing Liu (211 papers)
  2. Ian Lane (29 papers)
Citations (30)