Improving Sentence-Level Relation Extraction through Curriculum Learning (2107.09332v2)

Published 20 Jul 2021 in cs.CL and cs.AI

Abstract: Sentence-level relation extraction aims to classify the relation between two entities in a sentence. Sentence-level relation extraction corpora often contain examples that are difficult for a model to infer, as well as noisy data. In this paper, we propose a curriculum learning-based relation extraction model that splits the data by difficulty and uses the partitions for training. In experiments on the representative sentence-level relation extraction datasets TACRED and Re-TACRED, the proposed method achieved F1-scores of 75.0% and 91.4% respectively, which are state-of-the-art results.
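
The abstract's core idea of splitting data by difficulty and training easy-to-hard can be sketched as follows. This is a minimal illustration only: the paper's actual difficulty criterion and training scheduler are not specified here, so difficulty is approximated by an arbitrary per-example score (e.g., a baseline model's loss), and the function names are hypothetical.

```python
# Hedged sketch of a curriculum: sort examples by a difficulty score,
# partition them into stages, and train on a growing easy-to-hard pool.

def build_curriculum(examples, difficulty, num_stages=3):
    """Sort examples easy-to-hard and split them into num_stages buckets."""
    ordered = sorted(examples, key=difficulty)
    stage_size = -(-len(ordered) // num_stages)  # ceiling division
    return [ordered[i:i + stage_size]
            for i in range(0, len(ordered), stage_size)]

def curriculum_schedule(stages):
    """Yield cumulative training pools: stage 1, stages 1-2, and so on."""
    pool = []
    for stage in stages:
        pool = pool + stage
        yield list(pool)

# Toy usage: (sentence_id, loss) pairs, where loss proxies difficulty.
data = [("s1", 0.2), ("s2", 1.5), ("s3", 0.9),
        ("s4", 0.1), ("s5", 2.3), ("s6", 0.5)]
stages = build_curriculum(data, difficulty=lambda ex: ex[1], num_stages=3)
pools = list(curriculum_schedule(stages))
```

Each successive pool adds harder examples on top of the ones already seen, which is the usual cumulative ("baby steps") variant of curriculum learning.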

Authors (2)
  1. Seongsik Park (19 papers)
  2. Harksoo Kim (8 papers)
Citations (12)