
Cross Copy Network for Dialogue Generation (2010.11539v1)

Published 22 Oct 2020 in cs.CL

Abstract: In recent years, researchers across fields have witnessed the success of sequence-to-sequence models (e.g., LSTM with attention, Pointer-Generator Networks, and the Transformer) in dialogue content generation. While content fluency and accuracy usually serve as the major indicators for model training, dialogue logic, which carries critical information in certain domains, is often ignored. Taking customer service and court debate dialogues as examples, compatible logical structures can be observed across different dialogue instances, and this information can provide vital evidence for utterance generation. In this paper, we propose a novel network architecture, Cross Copy Networks (CCN), that exploits the current dialogue context and the logical structure of similar dialogue instances simultaneously. Experiments on two tasks, court debate and customer service content generation, show that the proposed algorithm is superior to existing state-of-the-art content generation models.
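The abstract does not spell out how the two copy sources are combined, but a standard way to extend a pointer-generator-style copy mechanism to two sources is to mix the vocabulary distribution with copy distributions over both the current context and a retrieved similar dialogue. The sketch below illustrates that mixing step only; all function and variable names are hypothetical, not taken from the paper.

```python
import numpy as np

def cross_copy_distribution(p_vocab, attn_context, attn_similar,
                            ctx_token_ids, sim_token_ids, gates):
    """Mix a vocabulary distribution with copy distributions from two
    sources: the current dialogue context and a similar dialogue instance.

    gates = (g_gen, g_ctx, g_sim), non-negative and summing to 1
    (in a real model these would be predicted by the decoder).
    Token-id arrays map each source position to a vocabulary index.
    """
    g_gen, g_ctx, g_sim = gates
    p_final = g_gen * p_vocab.copy()
    # Scatter-add attention mass onto the vocabulary positions of the
    # tokens appearing in each source (handles repeated tokens correctly).
    np.add.at(p_final, ctx_token_ids, g_ctx * attn_context)
    np.add.at(p_final, sim_token_ids, g_sim * attn_similar)
    return p_final

# Toy example: vocabulary of 6 tokens.
p_vocab = np.array([0.4, 0.2, 0.1, 0.1, 0.1, 0.1])
attn_context = np.array([0.7, 0.3])   # attention over 2 context tokens
attn_similar = np.array([0.5, 0.5])   # attention over 2 retrieved tokens
p = cross_copy_distribution(p_vocab, attn_context, attn_similar,
                            ctx_token_ids=np.array([2, 3]),
                            sim_token_ids=np.array([4, 5]),
                            gates=(0.5, 0.3, 0.2))
# p is still a valid probability distribution (sums to 1).
```

Because the gates form a convex combination and each attention distribution sums to 1, the mixed output remains a valid distribution over the vocabulary.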

Authors (7)
  1. Changzhen Ji (3 papers)
  2. Xin Zhou (319 papers)
  3. Yating Zhang (21 papers)
  4. Xiaozhong Liu (71 papers)
  5. Changlong Sun (37 papers)
  6. Conghui Zhu (20 papers)
  7. Tiejun Zhao (70 papers)
Citations (11)