
A Neural Conversation Generation Model via Equivalent Shared Memory Investigation (2108.09164v1)

Published 20 Aug 2021 in cs.CL

Abstract: Conversation generation, a challenging task in Natural Language Generation (NLG), has attracted increasing attention in recent years. A number of recent works have adopted sequence-to-sequence structures along with external knowledge, which successfully enhanced the quality of generated conversations. Nevertheless, few works have utilized knowledge extracted from similar conversations for utterance generation. Taking conversations in the customer service and court debate domains as examples, essential entities and phrases, as well as their associated logic and inter-relationships, can be extracted and borrowed from similar conversation instances. Such information can provide useful signals for improving conversation generation. In this paper, we propose a novel reading and memory framework called the Deep Reading Memory Network (DRMN), which is capable of remembering useful information from similar conversations to improve utterance generation. We apply our model to two large-scale conversation datasets from the justice and e-commerce fields. Experiments show that the proposed model outperforms state-of-the-art approaches.
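
The abstract describes DRMN only at a high level, so the sketch below illustrates the general memory-read idea it hints at: encode a retrieved similar conversation into memory slots, attend over those slots with the current decoder state, and fuse the read vector back into generation. Every name and design choice here (the `SimilarConversationMemory` class, the projection and fusion layers, the shapes) is a hypothetical illustration of the technique, not the paper's actual architecture.

```python
# A minimal memory-read sketch in the spirit of DRMN (assumptions, not
# the paper's architecture): attend over encoded utterances of a
# retrieved similar conversation and fuse the result with the decoder state.
import torch
import torch.nn as nn


class SimilarConversationMemory(nn.Module):
    """Dot-product attention over memory slots built from a similar
    conversation, followed by a simple gated-free fusion layer."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.query_proj = nn.Linear(hidden_size, hidden_size)
        self.fuse = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, decoder_state: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # decoder_state: (batch, hidden); memory: (batch, slots, hidden),
        # where each slot is an encoded utterance of the similar conversation.
        query = self.query_proj(decoder_state).unsqueeze(2)        # (batch, hidden, 1)
        scores = torch.bmm(memory, query).squeeze(2)               # (batch, slots)
        weights = torch.softmax(scores, dim=-1)                    # attention over slots
        read = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)  # (batch, hidden)
        # Combine what was "remembered" with the current decoder state.
        return torch.tanh(self.fuse(torch.cat([decoder_state, read], dim=-1)))


# Toy usage: batch of 2, memory of 5 encoded utterances, hidden size 8.
mem_reader = SimilarConversationMemory(hidden_size=8)
state = torch.randn(2, 8)
memory = torch.randn(2, 5, 8)
fused = mem_reader(state, memory)
print(fused.shape)  # torch.Size([2, 8])
```

In a full seq2seq model, the fused vector would replace (or augment) the decoder state at each generation step, which is how information borrowed from a similar conversation can steer utterance generation.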

Authors (7)
  1. Changzhen Ji (3 papers)
  2. Yating Zhang (21 papers)
  3. Xiaozhong Liu (71 papers)
  4. Adam Jatowt (57 papers)
  5. Changlong Sun (37 papers)
  6. Conghui Zhu (20 papers)
  7. Tiejun Zhao (70 papers)
Citations (1)