Applying a Generic Sequence-to-Sequence Model for Simple and Effective Keyphrase Generation (2201.05302v1)
Published 14 Jan 2022 in cs.CL and cs.AI
Abstract: In recent years, a number of keyphrase generation (KPG) approaches have been proposed, consisting of complex model architectures, dedicated training paradigms, and decoding strategies. In this work, we opt for simplicity and show how a commonly used seq2seq language model, BART, can be easily adapted to generate keyphrases from text in a single batch computation using a simple training procedure. Empirical results on five benchmarks show that our approach is as good as existing state-of-the-art KPG systems, while using a much simpler and easier-to-deploy framework.
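To make the idea concrete, here is a minimal sketch of how one might fine-tune BART for keyphrase generation with Hugging Face Transformers, where the target sequence is simply all keyphrases joined by a separator so they are produced in one decoding pass. The checkpoint, separator token, and example data are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumed setup, not the authors' exact recipe):
# fine-tune BART to emit all keyphrases of a document as one target string.
import torch
from transformers import BartTokenizerFast, BartForConditionalGeneration

tokenizer = BartTokenizerFast.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# One hypothetical training example: keyphrases concatenated with an
# assumed " ; " separator form the seq2seq target.
document = "We study sequence-to-sequence models for keyphrase generation ..."
keyphrases = ["keyphrase generation", "sequence-to-sequence models", "BART"]
target = " ; ".join(keyphrases)

inputs = tokenizer(document, truncation=True, max_length=512, return_tensors="pt")
labels = tokenizer(target, truncation=True, max_length=64, return_tensors="pt").input_ids

# Standard seq2seq cross-entropy training step.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()

# At inference, all keyphrases for a document come from a single generate() call.
model.eval()
with torch.no_grad():
    generated = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

In this formulation, generation amounts to ordinary conditional text generation, which is what lets the approach stay a plain seq2seq fine-tuning pipeline rather than requiring a dedicated KPG architecture or decoding strategy.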
- Md Faisal Mahbub Chowdhury (11 papers)
- Gaetano Rossiello (21 papers)
- Michael Glass (20 papers)
- Nandana Mihindukulasooriya (26 papers)
- Alfio Gliozzo (28 papers)