Topic-Guided Abstractive Text Summarization: a Joint Learning Approach (2010.10323v2)

Published 20 Oct 2020 in cs.CL

Abstract: We introduce a new approach for abstractive text summarization, Topic-Guided Abstractive Summarization, which calibrates long-range dependencies from topic-level features with globally salient content. The idea is to incorporate neural topic modeling with a Transformer-based sequence-to-sequence (seq2seq) model in a joint learning framework. This design can learn and preserve the global semantics of the document, providing additional contextual guidance for capturing its important ideas and thereby enhancing summary generation. We conduct extensive experiments on two datasets, and the results show that our proposed model outperforms many extractive and abstractive systems in terms of both ROUGE measurements and human evaluation. Our code is available at: https://github.com/chz816/tas.
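
The authors' exact architecture and loss weighting live in the linked repository; as a rough illustration only, the PyTorch sketch below shows one way a VAE-style neural topic model can be trained jointly with a Transformer seq2seq summarizer, with the inferred topic mixture added to the token embeddings and the two losses summed. All module names, dimensions, and the 0.1 loss weight are assumptions for the sketch, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicGuidedSummarizer(nn.Module):
    """Illustrative sketch (not the authors' code): a VAE-style neural topic
    model whose document-level topic mixture conditions a Transformer
    seq2seq summarizer; both components are trained with a joint loss."""

    def __init__(self, vocab_size=32000, bow_size=10000, num_topics=50, d_model=512):
        super().__init__()
        # Neural topic model: encode a bag-of-words vector into a topic mixture.
        self.ntm_encoder = nn.Linear(bow_size, 256)
        self.mu = nn.Linear(256, num_topics)
        self.logvar = nn.Linear(256, num_topics)
        self.ntm_decoder = nn.Linear(num_topics, bow_size)
        # Project the topic mixture into the seq2seq hidden space.
        self.topic_proj = nn.Linear(num_topics, d_model)
        # Transformer-based seq2seq summarizer.
        self.embed = nn.Embedding(vocab_size, d_model)
        self.seq2seq = nn.Transformer(d_model=d_model, batch_first=True)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids, bow):
        # Neural topic model pass: VAE over bag-of-words counts.
        h = F.relu(self.ntm_encoder(bow))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterize
        theta = F.softmax(z, dim=-1)                               # topic mixture
        bow_recon = F.log_softmax(self.ntm_decoder(theta), dim=-1)
        ntm_loss = -(bow * bow_recon).sum(-1).mean() \
                   - 0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()

        # Topic-guided seq2seq pass: add the topic vector to token embeddings
        # so generation is conditioned on the document's global semantics.
        topic_vec = self.topic_proj(theta).unsqueeze(1)            # (B, 1, d_model)
        src = self.embed(src_ids) + topic_vec
        tgt = self.embed(tgt_ids) + topic_vec
        dec = self.seq2seq(src, tgt)
        logits = self.lm_head(dec)
        gen_loss = F.cross_entropy(logits[:, :-1].reshape(-1, logits.size(-1)),
                                   tgt_ids[:, 1:].reshape(-1))

        # Joint objective: summarization loss plus a weighted topic-model loss.
        return gen_loss + 0.1 * ntm_loss
```

Adding the topic vector to the embeddings is only one plausible fusion choice; attention-based conditioning on topic features would fit the same joint-learning framing, and the paper's repository is the authoritative reference for how the fusion and loss weighting are actually done.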

Authors (5)
  1. Chujie Zheng (35 papers)
  2. Kunpeng Zhang (31 papers)
  3. Harry Jiannan Wang (6 papers)
  4. Ling Fan (8 papers)
  5. Zhe Wang (574 papers)
Citations (5)