On-The-Fly Information Retrieval Augmentation for Language Models
Published 3 Jul 2020 in cs.CL (arXiv:2007.01528v1)
Abstract: We experiment with information retrieval as an augmentation for pre-trained language models. The text corpus used in information retrieval can be viewed as a form of episodic memory that grows over time. By augmenting GPT-2 with information retrieval we achieve a zero-shot 15% relative reduction in perplexity on the Gigaword corpus without any re-training. We also validate our IR augmentation on an event co-reference task.
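The core idea — retrieve passages from a growing text corpus and prepend them to the model's context, with no retraining — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the TF-IDF-style scoring, the `EpisodicMemory` class, and the `augment_context` helper are all assumptions for the sake of the example; the paper's actual retrieval scoring and language-model interface are not specified here.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

class EpisodicMemory:
    """A growing text corpus used as retrieval memory (hypothetical helper)."""
    def __init__(self):
        self.docs = []

    def add(self, text):
        # The memory grows over time as new documents arrive.
        self.docs.append(text)

    def retrieve(self, query, k=1):
        # Score each stored document by TF-IDF-weighted overlap with the query.
        n = len(self.docs)
        tokenized = [tokenize(d) for d in self.docs]
        df = Counter()
        for toks in tokenized:
            df.update(set(toks))
        q_terms = set(tokenize(query))
        scores = []
        for i, toks in enumerate(tokenized):
            tf = Counter(toks)
            score = sum(tf[t] * math.log((1 + n) / (1 + df[t]))
                        for t in q_terms if t in tf)
            scores.append((score, i))
        scores.sort(reverse=True)
        return [self.docs[i] for _, i in scores[:k]]

def augment_context(memory, query, k=1):
    # Prepend retrieved passages to the LM input; the frozen, pre-trained
    # model then conditions on them zero-shot, with no parameter updates.
    retrieved = memory.retrieve(query, k)
    return "\n".join(retrieved + [query])
```

Usage: add documents to the memory as they arrive, then call `augment_context(memory, prompt)` and feed the result to the (unchanged) language model in place of the bare prompt.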