PLATO-K: Internal and External Knowledge Enhanced Dialogue Generation (2211.00910v1)

Published 2 Nov 2022 in cs.CL

Abstract: Recently, the practical deployment of open-domain dialogue systems has been plagued by the knowledge issue of information deficiency and factual inaccuracy. To this end, we introduce PLATO-K based on two-stage dialogic learning to strengthen internal knowledge memorization and external knowledge exploitation. In the first stage, PLATO-K learns through massive dialogue corpora and memorizes essential knowledge into model parameters. In the second stage, PLATO-K mimics human beings to search for external information and to leverage the knowledge in response generation. Extensive experiments reveal that the knowledge issue is alleviated significantly in PLATO-K with such comprehensive internal and external knowledge enhancement. Compared to the existing state-of-the-art Chinese dialogue model, the overall engagingness of PLATO-K is improved remarkably by 36.2% and 49.2% on chit-chat and knowledge-intensive conversations.
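
The abstract describes a two-stage design: stage one memorizes knowledge into the model parameters via large-scale dialogue training, and stage two retrieves external information and conditions the response on it. Below is a minimal illustrative sketch of that inference-time flow. All names (`internal_generate`, `retrieve_external`, the keyword-overlap retriever) are hypothetical stand-ins for exposition only; they are not from the PLATO-K paper or any released codebase.

```python
# Hypothetical sketch of the two-stage flow described in the abstract:
# parametric-only generation (internal knowledge) vs. retrieval-augmented
# generation (external knowledge). Names and logic are illustrative only.

from dataclasses import dataclass
from typing import List


@dataclass
class Turn:
    speaker: str
    text: str


def internal_generate(context: List[Turn]) -> str:
    """Stand-in for stage-1 behavior: answer from memorized (parametric) knowledge."""
    last = context[-1].text if context else ""
    return f"(parametric answer to: {last})"


def retrieve_external(query: str, corpus: List[str], top_k: int = 1) -> List[str]:
    """Stand-in retriever: rank external documents by naive keyword overlap."""
    tokens = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(tokens & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def knowledge_grounded_generate(context: List[Turn], knowledge: List[str]) -> str:
    """Stand-in for stage-2 behavior: condition the reply on retrieved snippets."""
    last = context[-1].text if context else ""
    evidence = " | ".join(knowledge)
    return f"(answer to: {last}; grounded in: {evidence})"


def respond(context: List[Turn], corpus: List[str], needs_external: bool) -> str:
    """Route a turn through parametric-only or retrieval-augmented generation."""
    if not needs_external:
        return internal_generate(context)
    snippets = retrieve_external(context[-1].text, corpus)
    return knowledge_grounded_generate(context, snippets)


if __name__ == "__main__":
    corpus = [
        "Beijing is the capital of China.",
        "PLATO-K is a Chinese open-domain dialogue model.",
    ]
    dialogue = [Turn("user", "What is the capital of China?")]
    print(respond(dialogue, corpus, needs_external=True))
```

In the paper's framing, the decision of when to consult external knowledge and how to fuse it into the response is learned during the second training stage; the boolean flag above is only a placeholder for that learned behavior.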

Authors (10)
  1. Siqi Bao (21 papers)
  2. Huang He (14 papers)
  3. Jun Xu (398 papers)
  4. Hua Lu (27 papers)
  5. Fan Wang (313 papers)
  6. Hua Wu (191 papers)
  7. Han Zhou (72 papers)
  8. Wenquan Wu (12 papers)
  9. Zheng-Yu Niu (10 papers)
  10. Haifeng Wang (194 papers)
Citations (4)
