Lifelong and Interactive Learning of Factual Knowledge in Dialogues (1907.13295v2)

Published 31 Jul 2019 in cs.CL, cs.AI, and cs.HC

Abstract: Dialogue systems increasingly use knowledge bases (KBs) of real-world facts to help generate quality responses. However, because KBs are inherently incomplete and remain fixed during a conversation, they limit a dialogue system's ability to answer questions, particularly questions involving entities or relations that are not in the KB. In this paper, we propose an engine for Continuous and Interactive Learning of Knowledge (CILK) that gives dialogue systems the ability to continuously and interactively learn and infer new knowledge during conversations. As more knowledge accumulates over time, the system learns better and answers more questions. Our empirical evaluation shows that CILK is promising.
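The core idea in the abstract, answering from a KB when possible and acquiring a missing fact from the user mid-conversation, can be sketched as follows. This is a minimal illustration, not the authors' CILK engine; the class and function names are invented for this example.

```python
# Minimal sketch of interactive knowledge acquisition for a KB-backed
# dialogue agent: answer from the triple store when possible, otherwise
# ask the user and store the supplied fact for future conversations.
# All names here are illustrative, not from the paper.

class InteractiveKB:
    def __init__(self, triples):
        # KB of (head, relation, tail) facts, indexed by (head, relation).
        self.kb = {(h, r): t for h, r, t in triples}

    def answer(self, head, relation, ask_user=None):
        """Answer a (head, relation, ?) query; on a KB miss, acquire
        the fact interactively and remember it (lifelong learning)."""
        key = (head, relation)
        if key not in self.kb and ask_user is not None:
            self.kb[key] = ask_user(head, relation)
        return self.kb.get(key)


agent = InteractiveKB([("Paris", "capital_of", "France")])

# Known fact: answered directly from the KB.
print(agent.answer("Paris", "capital_of"))                        # France

# Unknown fact: the agent asks the user, stores the reply, reuses it.
print(agent.answer("Oslo", "capital_of", lambda h, r: "Norway"))  # Norway
print(agent.answer("Oslo", "capital_of"))                         # Norway
```

The real system additionally infers the correctness of user-supplied facts with a learned knowledge-base embedding model rather than trusting replies verbatim; this sketch only shows the interactive acquisition loop.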

Authors (4)
  1. Sahisnu Mazumder (21 papers)
  2. Bing Liu (211 papers)
  3. Shuai Wang (466 papers)
  4. Nianzu Ma (6 papers)
Citations (23)