Pre-trained Sentence Embeddings for Implicit Discourse Relation Classification (2210.11005v1)

Published 20 Oct 2022 in cs.CL and cs.AI

Abstract: Implicit discourse relations bind smaller linguistic units into coherent texts. Automatic sense prediction for implicit relations is hard because it requires understanding the semantics of the linked arguments. Furthermore, annotated datasets contain relatively few labeled examples, due to the scale of the phenomenon: on average, each discourse relation encompasses several dozen words. In this paper, we explore the utility of pre-trained sentence embeddings as base representations in a neural network for implicit discourse relation sense classification. We present a series of experiments using both supervised end-to-end trained models and pre-trained sentence encoding techniques: SkipThought, Sent2vec, and InferSent. The pre-trained embeddings are competitive with the end-to-end model, and the approaches are complementary, with combined models yielding significant performance improvements on two of the three evaluations.
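
To make the setup concrete, here is a minimal sketch of the kind of classifier the abstract describes: fixed pre-trained sentence embeddings for the two discourse arguments feeding a small feed-forward network that predicts the relation sense. This is an illustrative assumption, not the authors' exact architecture; the framework (PyTorch), the `PairClassifier` name, the embedding dimension (4096, as in InferSent), the hidden size, and the InferSent-style pair features (concatenation, absolute difference, elementwise product) are all choices made here for the sketch.

```python
import torch
import torch.nn as nn

class PairClassifier(nn.Module):
    """Feed-forward sense classifier over frozen sentence embeddings
    of the two discourse arguments (illustrative sketch)."""

    def __init__(self, emb_dim: int = 4096, hidden: int = 512, num_senses: int = 4):
        super().__init__()
        # Input is 4 * emb_dim: [arg1; arg2; |arg1 - arg2|; arg1 * arg2]
        self.mlp = nn.Sequential(
            nn.Linear(4 * emb_dim, hidden),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden, num_senses),
        )

    def forward(self, arg1: torch.Tensor, arg2: torch.Tensor) -> torch.Tensor:
        # InferSent-style pair features over the two argument embeddings
        feats = torch.cat([arg1, arg2, (arg1 - arg2).abs(), arg1 * arg2], dim=-1)
        return self.mlp(feats)  # logits over discourse senses

# Usage with precomputed (frozen) sentence embeddings for each argument:
model = PairClassifier()
arg1 = torch.randn(8, 4096)  # batch of embeddings for the first argument
arg2 = torch.randn(8, 4096)  # batch of embeddings for the second argument
logits = model(arg1, arg2)   # shape: (8, num_senses)
```

Because the sentence encoder is frozen, only the small MLP is trained, which fits the point the abstract makes about annotated discourse datasets containing relatively few labeled examples.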

Authors (3)
  1. Murali Raghu Babu Balusu (2 papers)
  2. Yangfeng Ji (59 papers)
  3. Jacob Eisenstein (73 papers)
Citations (1)