
An Embarrassingly Simple Model for Dialogue Relation Extraction (2012.13873v2)

Published 27 Dec 2020 in cs.CL

Abstract: Dialogue relation extraction (RE) aims to predict the relation type of two entities mentioned in a dialogue. In this paper, we propose a simple yet effective model named SimpleRE for the RE task. SimpleRE captures the interrelations among multiple relations in a dialogue through a novel input format named BERT Relation Token Sequence (BRS). In BRS, multiple [CLS] tokens are used to capture possible relations between different pairs of entities mentioned in the dialogue. A Relation Refinement Gate (RRG) is then designed to extract relation-specific semantic representations in an adaptive manner. Experiments on the DialogRE dataset show that SimpleRE achieves the best performance, with a much shorter training time. Further, SimpleRE outperforms all direct baselines on sentence-level RE without using external resources.
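
The sketch below is not the paper's released code; it is one plausible reading of the abstract's two ideas. The `build_brs` helper and the `ToyRelationRefinementGate` module are hypothetical names, and the exact BRS layout is an assumption (the abstract only states that multiple [CLS] tokens are inserted, one per candidate entity pair, and that a gate adaptively refines relation-specific representations).

```python
# Hedged sketch, assuming a BRS-style input string and a simple sigmoid gate
# standing in for the Relation Refinement Gate (RRG). Requires PyTorch.
import torch
import torch.nn as nn


def build_brs(dialogue: str, entity_pairs):
    """Concatenate the dialogue with one [CLS] marker per candidate entity pair.

    The precise BRS layout is an assumption here; the abstract only says that
    multiple [CLS] tokens capture relations between different entity pairs.
    """
    pair_segments = " ".join(
        f"[CLS] {head} [SEP] {tail}" for head, tail in entity_pairs
    )
    return f"{dialogue} [SEP] {pair_segments} [SEP]"


class ToyRelationRefinementGate(nn.Module):
    """Minimal stand-in for the RRG: a sigmoid gate that adaptively mixes each
    pair-specific [CLS] vector with a dialogue-level context vector."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, pair_vecs: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # pair_vecs: (num_pairs, hidden); context: (hidden,)
        ctx = context.unsqueeze(0).expand_as(pair_vecs)
        g = torch.sigmoid(self.gate(torch.cat([pair_vecs, ctx], dim=-1)))
        return g * pair_vecs + (1.0 - g) * ctx


if __name__ == "__main__":
    brs = build_brs(
        "Speaker1: Hi, I'm Monica. Speaker2: Nice to meet you, I'm Chandler.",
        [("Speaker1", "Monica"), ("Speaker2", "Chandler")],
    )
    print(brs)
    rrg = ToyRelationRefinementGate(hidden_size=8)
    refined = rrg(torch.randn(2, 8), torch.randn(8))
    print(refined.shape)  # torch.Size([2, 8])
```

In this toy reading, each refined pair vector would then feed a relation classifier, so all pairs in a dialogue are scored in a single encoder pass; the actual architecture and gate design may differ from this illustration.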

Authors (5)
  1. Fuzhao Xue (24 papers)
  2. Aixin Sun (99 papers)
  3. Hao Zhang (948 papers)
  4. Jinjie Ni (18 papers)
  5. Eng Siong Chng (112 papers)
Citations (25)
