Retrieval-Free Knowledge-Grounded Dialogue Response Generation with Adapters (2105.06232v5)
Abstract: To diversify and enrich generated dialogue responses, knowledge-grounded dialogue has been investigated in recent years. Existing methods tackle the knowledge-grounding challenge by retrieving relevant sentences from a large corpus and augmenting the dialogues with this explicit extra information. Despite their success, however, these methods suffer from poor inference efficiency. This paper proposes KnowExpert, a framework that bypasses the explicit retrieval process by injecting knowledge into pre-trained language models with lightweight adapters, which are then adapted to the knowledge-grounded dialogue task. To the best of our knowledge, this is the first attempt to tackle this challenge without retrieval under an open-domain chit-chat scenario. Experimental results show that KnowExpert performs comparably with some retrieval-based baselines while being time-efficient at inference, demonstrating the effectiveness of the proposed method.
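The abstract does not detail the adapter architecture, so the following is only a minimal sketch of the general idea: a standard bottleneck adapter attached to a frozen block of a pre-trained language model, so that knowledge can be encoded in a small set of trainable parameters and no retrieval step is needed at inference. The bottleneck size, the placement after the block, and the use of `nn.TransformerEncoderLayer` as a stand-in backbone are illustrative assumptions, not the authors' exact configuration (KnowExpert additionally mixes several topic-specific adapter "experts", which is omitted here).

```python
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Generic bottleneck adapter: down-project, non-linearity, up-project,
    plus a residual connection. Only these parameters are trained."""

    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen backbone's representation intact.
        return hidden_states + self.up(self.act(self.down(hidden_states)))


class AdapterAugmentedBlock(nn.Module):
    """Wraps a frozen transformer block and inserts an adapter after it.
    The pre-trained weights stay fixed; knowledge is stored in the adapter,
    so no explicit retrieval over a corpus is required at inference time."""

    def __init__(self, frozen_block: nn.Module, hidden_size: int):
        super().__init__()
        self.block = frozen_block
        for p in self.block.parameters():
            p.requires_grad = False  # keep the pre-trained backbone frozen
        self.adapter = BottleneckAdapter(hidden_size)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        hidden_states = self.block(hidden_states)
        return self.adapter(hidden_states)


if __name__ == "__main__":
    # Stand-in for one layer of a pre-trained LM (hypothetical backbone).
    layer = nn.TransformerEncoderLayer(d_model=768, nhead=12, batch_first=True)
    block = AdapterAugmentedBlock(layer, hidden_size=768)
    x = torch.randn(2, 16, 768)   # (batch, seq_len, hidden)
    print(block(x).shape)         # torch.Size([2, 16, 768])
```

Because only the adapter weights are updated, each knowledge domain adds a small number of parameters on top of the shared backbone, which is what makes the retrieval-free setup time-efficient at inference.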
- Yan Xu (258 papers)
- Etsuko Ishii (18 papers)
- Samuel Cahyawijaya (75 papers)
- Zihan Liu (102 papers)
- Genta Indra Winata (94 papers)
- Andrea Madotto (64 papers)
- Dan Su (101 papers)
- Pascale Fung (150 papers)