A Cascade Dual-Decoder Model for Joint Entity and Relation Extraction (2106.14163v2)

Published 27 Jun 2021 in cs.CL and cs.AI

Abstract: In knowledge graph construction, a challenging issue is how to extract complex (e.g., overlapping) entities and relationships from a small amount of unstructured historical data. Traditional pipeline methods divide the extraction into two separate subtasks, which misses the potential interaction between them and may lead to error propagation. In this work, we propose an effective cascade dual-decoder method to extract overlapping relational triples, which consists of a text-specific relation decoder and a relation-corresponded entity decoder. The text-specific relation decoder detects relations from a sentence at the text level, i.e., according to the semantic information of the whole sentence. For each extracted relation, which is represented by a trainable embedding, the relation-corresponded entity decoder detects the corresponding head and tail entities using a span-based tagging scheme. In this way, the overlapping triple problem is tackled naturally. We conducted experiments on a real-world open-pit mine dataset and two public datasets to verify the method's generalizability. The experimental results demonstrate the effectiveness and competitiveness of our method, which achieves better F1 scores under strict evaluation metrics. Our implementation is available at https://github.com/prastunlp/DualDec.
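
For intuition, below is a minimal PyTorch sketch of the cascade idea described in the abstract: a sentence-level relation decoder followed by a relation-conditioned, span-based entity tagger. All class, variable, and layer names are hypothetical assumptions for illustration, not the authors' implementation; the official code is in the linked repository.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the cascade dual-decoder idea (not the official code,
# see https://github.com/prastunlp/DualDec for the authors' implementation).
class CascadeDualDecoder(nn.Module):
    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        # Text-specific relation decoder: multi-label relation detection
        # from a sentence-level representation.
        self.relation_classifier = nn.Linear(hidden_size, num_relations)
        # Trainable embedding per relation, consumed by the entity decoder.
        self.relation_embeddings = nn.Embedding(num_relations, hidden_size)
        # Relation-corresponded entity decoder: span-based start/end taggers
        # for head and tail entities, conditioned on the relation embedding.
        self.head_start = nn.Linear(hidden_size, 1)
        self.head_end = nn.Linear(hidden_size, 1)
        self.tail_start = nn.Linear(hidden_size, 1)
        self.tail_end = nn.Linear(hidden_size, 1)

    def forward(self, token_states: torch.Tensor, sentence_state: torch.Tensor):
        # token_states: (batch, seq_len, hidden), e.g. from a BERT-style encoder
        # sentence_state: (batch, hidden), e.g. a pooled [CLS] vector
        # 1) Detect relations at the text level (multi-label).
        rel_probs = torch.sigmoid(self.relation_classifier(sentence_state))

        # 2) Condition token states on each relation embedding and tag
        #    head/tail entity spans via per-token start/end probabilities.
        all_rel = self.relation_embeddings.weight                 # (num_rel, hidden)
        conditioned = token_states.unsqueeze(1) + all_rel.unsqueeze(0).unsqueeze(2)
        head_start = torch.sigmoid(self.head_start(conditioned)).squeeze(-1)
        head_end = torch.sigmoid(self.head_end(conditioned)).squeeze(-1)
        tail_start = torch.sigmoid(self.tail_start(conditioned)).squeeze(-1)
        tail_end = torch.sigmoid(self.tail_end(conditioned)).squeeze(-1)
        # Shapes: (batch, num_rel, seq_len); at inference, only relations whose
        # rel_probs exceed a threshold would have their spans decoded.
        return rel_probs, (head_start, head_end), (tail_start, tail_end)
```

Because entity spans are tagged separately for every detected relation, a single entity pair can participate in multiple triples, which is how this cascade handles overlapping triples.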

Authors (8)
  1. Lianbo Ma (9 papers)
  2. Huimin Ren (3 papers)
  3. Xiliang Zhang (11 papers)
  4. Jian Cheng (127 papers)
  5. Tian Zhang (54 papers)
  6. Shuang Zhang (132 papers)
  7. Guo Yu (34 papers)
  8. Shangce Gao (8 papers)
Citations (16)