
RT-KGD: Relation Transition Aware Knowledge-Grounded Dialogue Generation (2207.08212v1)

Published 17 Jul 2022 in cs.CL and cs.AI

Abstract: Grounding dialogue systems with external knowledge is a promising way to improve the quality of responses. Most existing works adopt knowledge graphs (KGs) as the external resource, focusing on the contribution of entities in the last utterance of the dialogue to context understanding and response generation. Nevertheless, the correlations between knowledge implied in the multi-turn context and the transition regularities between relations in KGs remain under-explored. To this end, we propose a Relation Transition aware Knowledge-Grounded Dialogue Generation model (RT-KGD). Specifically, inspired by the latent logic of human conversation, our model integrates dialogue-level relation transition regularities with turn-level entity semantic information. In this manner, interactions between knowledge items are modeled to produce abundant clues for predicting the appropriate knowledge and generating coherent responses. Experimental results on both automatic and manual evaluation indicate that our model outperforms state-of-the-art baselines.
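The core idea of the abstract — fusing a dialogue-level relation-transition history with turn-level entity semantics to score candidate knowledge — can be sketched in a toy form. This is only an illustrative sketch with made-up dimensions, random embeddings, and a simple additive fusion; the paper's actual architecture (its encoders and training objective) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes; the real model learns these embeddings jointly.
NUM_RELATIONS, NUM_ENTITIES, DIM = 10, 20, 8

rel_emb = rng.normal(size=(NUM_RELATIONS, DIM))  # relation embeddings
ent_emb = rng.normal(size=(NUM_ENTITIES, DIM))   # entity embeddings

def score_candidates(relation_history, last_entities, candidates):
    """Score (head, relation, tail) knowledge candidates against context.

    relation_history: relation IDs used across previous dialogue turns
                      (the dialogue-level transition signal).
    last_entities:    entity IDs mentioned in the last utterance
                      (the turn-level semantic signal).
    """
    # Dialogue-level signal: pool the embeddings of relations seen so far.
    rel_ctx = rel_emb[relation_history].mean(axis=0)
    # Turn-level signal: pool embeddings of entities from the last utterance.
    ent_ctx = ent_emb[last_entities].mean(axis=0)
    ctx = rel_ctx + ent_ctx  # naive additive fusion, for illustration only
    # Score each candidate triple by similarity to the fused context.
    return [float(ctx @ (ent_emb[h] + rel_emb[r] + ent_emb[t]))
            for h, r, t in candidates]

history = [1, 3, 3]            # relation IDs from earlier turns
entities = [2, 5]              # entity IDs in the last utterance
cands = [(2, 3, 7), (4, 8, 9)] # candidate (head, relation, tail) triples
print(score_candidates(history, entities, cands))
```

The highest-scoring triple would then be handed to the response generator; in RT-KGD itself this selection is learned end-to-end rather than done by fixed dot-product scoring.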

Authors (7)
  1. Kexin Wang
  2. Zhixu Li
  3. Jiaan Wang
  4. Jianfeng Qu
  5. Ying He
  6. An Liu
  7. Lei Zhao
Citations (3)