Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER (2204.02173v1)

Published 5 Apr 2022 in cs.CL and cs.AI

Abstract: We investigate the task of complex NER for the English language. The task is non-trivial due to the semantic ambiguity of the textual structure and the rarity of such entities in the prevalent literature. Using pre-trained language models such as BERT, we obtain competitive performance on this task. We qualitatively analyze the performance of multiple architectures for the task. All our models outperform the baseline by a significant margin; our best-performing model beats the baseline F1-score by over 9%.
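The abstract reports results as entity-level F1 over the baseline. As a point of reference (this is a generic sketch of the standard NER metric, not the authors' code), entity-level F1 can be computed by extracting labeled spans from BIO tag sequences and comparing gold against predicted spans:

```python
# Generic sketch of entity-level F1 for BIO-tagged NER output.
# Illustrative only; not taken from the paper.

def extract_spans(tags):
    """Return a set of (start, end, type) entity spans from a BIO tag list.

    Stray I- tags with no matching B- are ignored (strict BIO reading).
    """
    spans, start, etype = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and etype != tag[2:]
        ):
            if start is not None:
                spans.add((start, i, etype))
                start, etype = None, None
            if tag.startswith("B-"):
                start, etype = i, tag[2:]
    return spans

def entity_f1(gold_tags, pred_tags):
    """Micro-averaged entity-level F1: a span counts only on exact match."""
    gold, pred = extract_spans(gold_tags), extract_spans(pred_tags)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

For example, if the gold tags are `["B-CW", "I-CW", "O", "B-PER"]` and the model predicts `["B-CW", "I-CW", "O", "O"]`, one of two entities is found exactly, giving precision 1.0, recall 0.5, and F1 ≈ 0.667.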

Authors (3)
  1. Amit Pandey (5 papers)
  2. Swayatta Daw (2 papers)
  3. Vikram Pudi (11 papers)
Citations (3)
