Point-less: More Abstractive Summarization with Pointer-Generator Networks (1905.01975v1)

Published 18 Apr 2019 in cs.CL

Abstract: The Pointer-Generator architecture has been shown to be a significant improvement for abstractive summarization seq2seq models. However, the summaries produced by this model are largely extractive: over 30% of the generated sentences are copied from the source text. This work proposes a multihead attention mechanism, pointer dropout, and two new loss functions to promote more abstractive summaries while maintaining similar ROUGE scores. Neither the multihead attention nor the dropout improves N-gram novelty; however, the dropout acts as a regularizer, which improves the ROUGE score. The new loss functions achieve significantly higher novelty in N-grams and sentences, at the cost of a slightly lower ROUGE score.
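The core of the architecture above is the pointer-generator output distribution, which mixes a vocabulary (generation) distribution with an attention-based copy distribution via a gate p_gen; pointer dropout, as described in the abstract, randomly disables the copy path during training. The following is a minimal NumPy sketch of that mixing step; the function name, the `pointer_dropout` knob, and the toy values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def final_distribution(p_vocab, attention, src_ids, p_gen,
                       pointer_dropout=0.0, training=True):
    """Blend generator and pointer distributions (pointer-generator style).

    With probability `pointer_dropout` (a hypothetical knob mirroring the
    paper's pointer-dropout idea) the copy path is disabled for this step,
    forcing the model to rely on generation alone.
    """
    if training and rng.random() < pointer_dropout:
        p_gen = 1.0  # pointer dropped: pure generation this step
    final = p_gen * p_vocab                  # generation component
    copy = (1.0 - p_gen) * attention         # copy component
    # scatter-add copy probabilities onto the source token ids
    np.add.at(final, src_ids, copy)
    return final

# Toy example: vocabulary of 10 tokens, source sentence of 4 tokens.
p_vocab = np.full(10, 0.1)                   # uniform generation dist
attention = np.array([0.5, 0.2, 0.2, 0.1])   # attention over source
src_ids = np.array([3, 7, 3, 9])             # vocab ids of source tokens
dist = final_distribution(p_vocab, attention, src_ids,
                          p_gen=0.6, pointer_dropout=0.0)
assert abs(dist.sum() - 1.0) < 1e-9          # still a valid distribution
```

A high `pointer_dropout` pushes probability mass toward the generator, which is the intuition behind using it to encourage novel (non-copied) N-grams.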

Authors (4)
  1. Freek Boutkan (1 paper)
  2. Jorn Ranzijn (1 paper)
  3. David Rau (8 papers)
  4. Eelco van der Wel (2 papers)
Citations (6)