HEGEL: Hypergraph Transformer for Long Document Summarization (2210.04126v1)

Published 9 Oct 2022 in cs.CL

Abstract: Extractive summarization for long documents is challenging due to the extended, structured input context. Long-distance sentence dependencies hinder cross-sentence relation modeling, a critical step in extractive summarization. This paper proposes HEGEL, a hypergraph neural network for long document summarization that captures high-order cross-sentence relations. HEGEL updates and learns effective sentence representations with hypergraph transformer layers and fuses different types of sentence dependencies, including latent topics, keyword coreference, and section structure. We validate HEGEL with extensive experiments on two benchmark datasets, and the results demonstrate its effectiveness and efficiency.
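
The sketch below is an illustrative (not the authors') take on the core idea from the abstract: sentences are nodes, and hyperedges group sentences that share a latent topic, a keyword, or a section. A hypergraph layer then passes messages node-to-hyperedge and hyperedge-back-to-node. The class and parameter names are hypothetical, and the paper's attention-based fusion is simplified here to mean aggregation.

```python
# Minimal sketch of one hypergraph message-passing layer over sentence embeddings.
# Assumption: a binary incidence matrix H where H[i, e] = 1 if sentence i belongs
# to hyperedge e (e.g. shares a topic, keyword, or section with its other members).
# This is a simplified illustration, not HEGEL's actual implementation.
import torch
import torch.nn as nn


class HypergraphLayer(nn.Module):  # hypothetical class name
    def __init__(self, dim: int):
        super().__init__()
        self.node_to_edge = nn.Linear(dim, dim)  # projects aggregated sentences into a hyperedge representation
        self.edge_to_node = nn.Linear(dim, dim)  # projects aggregated hyperedges back to sentence updates
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor, incidence: torch.Tensor) -> torch.Tensor:
        # x: (num_sentences, dim); incidence: (num_sentences, num_hyperedges)
        edge_deg = incidence.sum(dim=0).clamp(min=1.0)  # sentences per hyperedge
        node_deg = incidence.sum(dim=1).clamp(min=1.0)  # hyperedges per sentence
        # Phase 1: average member sentences into each hyperedge representation.
        edge_repr = self.act(self.node_to_edge((incidence.t() @ x) / edge_deg.unsqueeze(-1)))
        # Phase 2: average incident hyperedges back into each sentence, with a residual update.
        x_new = self.act(self.edge_to_node((incidence @ edge_repr) / node_deg.unsqueeze(-1)))
        return x + x_new


# Toy usage: 5 sentence embeddings, 2 hyperedges (e.g. one topic group, one section group).
sent = torch.randn(5, 64)
H = torch.tensor([[1., 0.], [1., 0.], [1., 1.], [0., 1.], [0., 1.]])
out = HypergraphLayer(64)(sent, H)
print(out.shape)  # torch.Size([5, 64])
```

Stacking several such layers, each built from a different incidence matrix (topics, keywords, sections), is one plausible way to fuse the dependency types the abstract lists.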

Authors (3)
  1. Haopeng Zhang (32 papers)
  2. Xiao Liu (402 papers)
  3. Jiawei Zhang (529 papers)
Citations (39)
