
Neural Discourse Modeling of Conversations (1607.04576v1)

Published 15 Jul 2016 in cs.CL and cs.NE

Abstract: Deep neural networks have shown recent promise in many language-related tasks such as the modeling of conversations. We extend RNN-based sequence to sequence models to capture the long range discourse across many turns of conversation. We perform a sensitivity analysis on how much additional context affects performance, and provide quantitative and qualitative evidence that these models are able to capture discourse relationships across multiple utterances. Our results quantify how adding an additional RNN layer for modeling discourse improves the quality of output utterances, and how providing more of the previous conversation as input also improves performance. By searching the generated outputs for specific discourse markers we show how neural discourse models can exhibit increased coherence and cohesion in conversations.
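
The abstract describes adding an RNN layer on top of a sequence-to-sequence model so that discourse context from earlier turns conditions the generated response. The sketch below is a minimal illustration of that general idea, assuming a PyTorch implementation with GRU cells; the module names, layer sizes, and wiring are illustrative assumptions, not the paper's exact architecture.

```python
# Hedged sketch of a hierarchical seq2seq model with a discourse-level RNN.
# All hyperparameters and names here are assumptions for illustration only.
import torch
import torch.nn as nn

class HierarchicalSeq2Seq(nn.Module):
    """Seq2seq model with an extra RNN over per-turn encodings,
    intended to carry discourse context across conversation turns."""

    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Utterance-level encoder: reads the tokens of each prior turn.
        self.utt_encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Discourse-level RNN: reads one vector per prior turn.
        self.discourse_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)
        # Decoder: generates the response conditioned on the discourse state.
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, context_turns, response_tokens):
        # context_turns:   (batch, n_turns, turn_len) token ids of prior turns
        # response_tokens: (batch, resp_len) token ids of the target response
        b, n_turns, turn_len = context_turns.shape
        flat = context_turns.reshape(b * n_turns, turn_len)
        _, utt_h = self.utt_encoder(self.embed(flat))       # (1, b*n_turns, hid)
        utt_vecs = utt_h.squeeze(0).view(b, n_turns, -1)     # (b, n_turns, hid)
        _, disc_h = self.discourse_rnn(utt_vecs)             # (1, b, hid)
        dec_out, _ = self.decoder(self.embed(response_tokens), disc_h)
        return self.out(dec_out)                             # (b, resp_len, vocab)
```

In this sketch, feeding more prior turns simply means a larger `n_turns` dimension in `context_turns`, which matches the abstract's sensitivity analysis on how much additional context to provide.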

Authors (4)
  1. John M. Pierre (1 paper)
  2. Mark Butler (2 papers)
  3. Jacob Portnoff (1 paper)
  4. Luis Aguilar (14 papers)
Citations (2)