On the Importance of Word and Sentence Representation Learning in Implicit Discourse Relation Classification (2004.12617v2)

Published 27 Apr 2020 in cs.CL

Abstract: Implicit discourse relation classification is one of the most difficult parts of shallow discourse parsing, as predicting the relation without explicit connectives requires language understanding at both the text-span and sentence levels. Previous studies mainly focus on the interactions between the two arguments. We argue that a powerful contextualized representation module, a bilateral multi-perspective matching module, and a global information fusion module are all important to implicit discourse analysis. We propose a novel model that combines these modules. Extensive experiments show that our proposed model outperforms BERT and other state-of-the-art systems by around 8% on the PDTB dataset and by around 16% on the CoNLL 2016 datasets. We also analyze the effectiveness of the different modules in the implicit discourse relation classification task and demonstrate how different levels of representation learning affect the results.
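
The abstract names a "bilateral multi-perspective matching module" but gives no implementation details. As a rough illustration only, the sketch below shows the standard multi-perspective cosine matching operation (in the style of BiMPM, Wang et al. 2017) that such a module is typically built from: each of the two argument representations is reweighted by a learned vector per perspective before a cosine comparison. The PyTorch framing, tensor shapes, and function name are assumptions, not the authors' code.

```python
# Hedged sketch of a BiMPM-style multi-perspective matching operation.
# Assumption: v1, v2 are contextualized representations of the two
# discourse arguments; W holds one learned weight vector per perspective.
import torch
import torch.nn.functional as F

def multi_perspective_match(v1, v2, W):
    """Cosine similarity between two vectors under l learned perspectives.

    v1, v2: (batch, hidden)  -- e.g. pooled encodings of Arg1 and Arg2
    W:      (l, hidden)      -- one weight vector per perspective
    Returns (batch, l) matching vector m with m_k = cos(W_k * v1, W_k * v2).
    """
    # Element-wise reweighting, broadcast over perspectives:
    # (batch, 1, hidden) * (1, l, hidden) -> (batch, l, hidden)
    p1 = v1.unsqueeze(1) * W.unsqueeze(0)
    p2 = v2.unsqueeze(1) * W.unsqueeze(0)
    return F.cosine_similarity(p1, p2, dim=-1)  # (batch, l)

# Usage with placeholder sizes (hidden=768 mirrors a BERT-base encoder):
batch, hidden, l = 4, 768, 20
W = torch.nn.Parameter(torch.randn(l, hidden))
arg1 = torch.randn(batch, hidden)  # contextualized repr. of argument 1
arg2 = torch.randn(batch, hidden)  # contextualized repr. of argument 2
m = multi_perspective_match(arg1, arg2, W)
print(m.shape)  # torch.Size([4, 20])
```

In the full model described by the abstract, the resulting matching vectors would presumably be combined with the contextualized representations by the global information fusion module before classification; that wiring is not specified here.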

Authors (4)
  1. Xin Liu (820 papers)
  2. Jiefu Ou (9 papers)
  3. Yangqiu Song (196 papers)
  4. Xin Jiang (242 papers)
Citations (28)
