Paraphrasing with Large Language Models (1911.09661v1)
Published 21 Nov 2019 in cs.CL and cs.LG
Abstract: Recently, LLMs such as GPT-2 have proven extremely adept at text generation and, with the aid of fine-tuning, have achieved high-quality results in many downstream NLP tasks such as text classification, sentiment analysis, and question answering. We present a useful technique for using an LLM to perform the task of paraphrasing on a variety of texts and subjects. Our approach is demonstrated to be capable of generating paraphrases not only at the sentence level but also for longer spans of text such as paragraphs, without needing to break the text into smaller chunks.
- Sam Witteveen (12 papers)
- Martin Andrews (15 papers)
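
Below is a minimal sketch of the kind of setup the abstract describes: conditioning a GPT-2-style model on an input text and sampling a paraphrase of it. The checkpoint path, the `>>>>` separator, and the sampling parameters are illustrative assumptions, not the authors' actual configuration; a real system would use a GPT-2 checkpoint fine-tuned on (original, paraphrase) pairs joined by such a separator.

```python
# Hedged sketch (not the authors' code): paraphrase generation with a GPT-2
# model via Hugging Face Transformers. MODEL_DIR and SEP are assumptions.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

MODEL_DIR = "gpt2"  # hypothetical: replace with a paraphrase-fine-tuned checkpoint
SEP = " >>>> "      # assumed delimiter separating the source text from its paraphrase

tokenizer = GPT2Tokenizer.from_pretrained(MODEL_DIR)
model = GPT2LMHeadModel.from_pretrained(MODEL_DIR)
model.eval()

def paraphrase(text: str, max_new_tokens: int = 128) -> str:
    """Condition the model on the source text plus a separator, then sample a continuation."""
    prompt = text + SEP
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    output_ids = model.generate(
        input_ids,
        max_length=input_ids.shape[1] + max_new_tokens,
        do_sample=True,   # sampling rather than greedy decoding, to get varied paraphrases
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated tokens after the prompt.
    generated = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
    return generated.strip()

if __name__ == "__main__":
    print(paraphrase("Large language models can rewrite text while preserving its meaning."))
```

Because the model conditions on the full input sequence up to its context window, the same procedure can in principle be applied to paragraph-length inputs, which is the property the abstract highlights.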