
A Three-Stage Learning Framework for Low-Resource Knowledge-Grounded Dialogue Generation (2109.04096v1)

Published 9 Sep 2021 in cs.CL and cs.AI

Abstract: Neural conversation models have shown great potential for generating fluent and informative responses by introducing external background knowledge. Nevertheless, it is laborious to construct such knowledge-grounded dialogues, and existing models usually perform poorly when transferred to new domains with limited training samples. Therefore, building a knowledge-grounded dialogue system under the low-resource setting remains a crucial problem. In this paper, we propose a novel three-stage learning framework based on weakly supervised learning, which benefits from large-scale ungrounded dialogues and an unstructured knowledge base. To better cooperate with this framework, we devise a variant of the Transformer with a decoupled decoder, which facilitates the disentangled learning of response generation and knowledge incorporation. Evaluation results on two benchmarks indicate that our approach outperforms other state-of-the-art methods with less training data, and even in the zero-resource scenario, our approach still performs well.
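The abstract's "decoupled decoder" separates response generation from knowledge incorporation. The paper's exact architecture is not given here, so the following is only an illustrative sketch, assuming one attention branch over the dialogue context and one over retrieved knowledge, fused by a hypothetical sigmoid gate (the function names and gating scheme are assumptions, not the authors' specification):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # illustrative hidden size

def attend(query, memory):
    # Scaled dot-product attention of a single query over a memory matrix.
    scores = memory @ query / np.sqrt(d)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ memory

def decoupled_step(query, context_mem, knowledge_mem, gate_w):
    # Two decoupled branches: response generation attends to the dialogue
    # context, knowledge incorporation attends to the knowledge memory.
    ctx = attend(query, context_mem)
    knw = attend(query, knowledge_mem)
    # Hypothetical scalar gate deciding how much knowledge to inject.
    g = 1.0 / (1.0 + np.exp(-(gate_w @ np.concatenate([ctx, knw]))))
    return g * ctx + (1.0 - g) * knw

query = rng.standard_normal(d)
context_mem = rng.standard_normal((5, d))    # 5 context token states
knowledge_mem = rng.standard_normal((7, d))  # 7 knowledge token states
gate_w = rng.standard_normal(2 * d)
fused = decoupled_step(query, context_mem, knowledge_mem, gate_w)
print(fused.shape)  # (8,)
```

Keeping the two branches separate is what lets the framework pretrain them on different resources: the context branch on ungrounded dialogues, the knowledge branch on an unstructured knowledge base.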

Authors (6)
  1. Shilei Liu (18 papers)
  2. Xiaofeng Zhao (22 papers)
  3. Bochao Li (11 papers)
  4. Feiliang Ren (18 papers)
  5. Longhui Zhang (9 papers)
  6. Shujuan Yin (5 papers)
Citations (30)