GLiNER multi-task: Generalist Lightweight Model for Various Information Extraction Tasks (2406.12925v2)

Published 14 Jun 2024 in cs.LG, cs.AI, cs.CL, and cs.IR

Abstract: Information extraction tasks require accurate, efficient, and generalisable models. Classical supervised deep learning approaches can achieve the required performance, but they need large datasets and are limited in their ability to adapt to different tasks. On the other hand, LLMs demonstrate good generalization, meaning that they can adapt to many different tasks based on user requests. However, LLMs are computationally expensive and tend to fail to generate structured outputs. In this article, we introduce a new kind of GLiNER model that can be used for various information extraction tasks while remaining a small encoder model. Our model achieved SoTA performance on zero-shot NER benchmarks and leading performance on question-answering, summarization, and relation extraction tasks. Additionally, we cover experimental results on self-learning approaches for named entity recognition using GLiNER models.
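
The abstract describes supplying the entity types at inference time to a small encoder model instead of fine-tuning a task-specific classifier. A minimal sketch of that zero-shot NER usage with the open-source gliner Python package is shown below; the checkpoint name, example sentence, and label set are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: zero-shot NER with the gliner package.
# The checkpoint name and labels below are illustrative assumptions.
from gliner import GLiNER

# Load a pretrained GLiNER encoder (checkpoint name is an assumption).
model = GLiNER.from_pretrained("urchade/gliner_base")

text = "Ihor Stepanov and Mykhailo Shtopko released GLiNER multi-task in June 2024."

# Entity types are passed at inference time, so no task-specific fine-tuning is needed.
labels = ["person", "model", "date"]

# Returns the extracted spans with their predicted labels.
entities = model.predict_entities(text, labels, threshold=0.5)

for entity in entities:
    print(entity["text"], "=>", entity["label"])
```

The same prompt-free pattern generalizes to the other tasks the paper lists (question answering, summarization, relation extraction) by changing the labels or instructions given to the multi-task model, rather than by retraining it.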

Authors (2)
  1. Ihor Stepanov (2 papers)
  2. Mykhailo Shtopko (1 paper)
Citations (1)