
Towards Continual Entity Learning in Language Models for Conversational Agents (2108.00082v2)

Published 30 Jul 2021 in cs.CL and cs.AI

Abstract: Neural language models (LMs) trained on diverse corpora are known to work well on previously seen entities; however, updating these models with dynamically changing entities such as place names, song titles, and shopping items requires re-training from scratch and collecting full sentences containing these entities. We aim to address this issue by introducing entity-aware language models (EALM), where we integrate entity models trained on catalogues of entities into the pre-trained LM. Our combined language model adaptively adds information from the entity models into the pre-trained LM depending on the sentence context. Our entity models can be updated independently of the pre-trained LM, enabling us to influence the distribution of entities output by the final LM without any further training of the pre-trained LM. We show significant perplexity improvements on task-oriented dialogue datasets, especially on long-tailed utterances, with an ability to continually adapt to new entities (to an extent).
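
The abstract describes adaptively combining a separately trained entity model with a pre-trained LM based on sentence context. As one hedged illustration of such a scheme (not the paper's exact architecture), the sketch below interpolates the two next-token distributions with a context-dependent gate; the class name GatedEntityLM, the component APIs, and the use of a linear layer over the LM's hidden state for gating are all assumptions made for this example.

```python
import torch
import torch.nn as nn

class GatedEntityLM(nn.Module):
    """Hypothetical sketch: mix a frozen pre-trained LM with a separately
    trained entity model using a context-dependent gate."""

    def __init__(self, base_lm: nn.Module, entity_model: nn.Module, hidden_dim: int):
        super().__init__()
        self.base_lm = base_lm            # pre-trained LM, kept frozen
        self.entity_model = entity_model  # trained on entity catalogues, swappable
        self.gate = nn.Linear(hidden_dim, 1)  # maps LM context to a mixing weight

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # Assumed APIs: each component returns next-token logits over a shared
        # vocabulary; the base LM also exposes its final hidden states.
        base_logits, hidden = self.base_lm(input_ids)
        entity_logits = self.entity_model(input_ids)

        p_base = torch.softmax(base_logits, dim=-1)
        p_entity = torch.softmax(entity_logits, dim=-1)

        # Gate in (0, 1), computed from the sentence context, decides how much
        # entity-model probability mass to inject at each position.
        g = torch.sigmoid(self.gate(hidden))

        # Convex combination of the two next-token distributions.
        return (1.0 - g) * p_base + g * p_entity
```

Because the entity model and the gate are the only entity-specific components in this sketch, the entity catalogue could be retrained or swapped independently, which matches the update-without-retraining property the abstract describes.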

Authors (2)
  1. Ravi Teja Gadde (6 papers)
  2. Ivan Bulyko (23 papers)
Citations (1)
