Restructuring Conversations using Discourse Relations for Zero-shot Abstractive Dialogue Summarization (1902.01615v2)

Published 5 Feb 2019 in cs.CL

Abstract: Dialogue summarization is a challenging problem due to the informal and unstructured nature of conversational data. Recent advances in abstractive summarization have focused on data-hungry neural models, and adapting these models to a new domain requires a domain-specific, manually annotated corpus created by linguistic experts. We propose a zero-shot abstractive dialogue summarization method that uses discourse relations to provide structure to conversations, and then applies an out-of-the-box document summarization model to create the final summaries. Experiments on the AMI and ICSI meeting corpora, with document summarization models such as PGN and BART, show that our method improves the ROUGE score by up to 3 points and even performs competitively against other state-of-the-art methods.
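For illustration, the sketch below approximates the two-stage pipeline the abstract describes: restructure a conversation into document-like text, then summarize it with an off-the-shelf model. This is not the authors' implementation; restructure_dialogue is a hypothetical placeholder (the paper derives structure from discourse relations, not simple turn concatenation), and the facebook/bart-large-cnn checkpoint via Hugging Face transformers is an assumed stand-in for the BART summarizer.

```python
# Minimal sketch of the zero-shot pipeline: (1) restructure a conversation into
# a document-like text, (2) summarize it with an off-the-shelf model (BART).
# The restructuring step here is a simplified placeholder; the paper's method
# reorganizes turns using discourse relations between them.
from transformers import pipeline  # assumes Hugging Face transformers is installed


def restructure_dialogue(turns):
    """Placeholder for the paper's discourse-relation-based restructuring.

    Here we only join speaker turns into flowing text; the actual method
    restructures the conversation according to discourse relations.
    """
    return " ".join(f"{speaker}: {utterance}" for speaker, utterance in turns)


def summarize_dialogue(turns, model_name="facebook/bart-large-cnn"):
    """Run an out-of-the-box document summarizer on the restructured dialogue."""
    summarizer = pipeline("summarization", model=model_name)
    document = restructure_dialogue(turns)
    result = summarizer(document, max_length=60, min_length=15, do_sample=False)
    return result[0]["summary_text"]


if __name__ == "__main__":
    # Toy meeting transcript as (speaker, utterance) pairs.
    meeting = [
        ("A", "Let's decide on the remote control's shape today."),
        ("B", "I think a curved design would be more comfortable to hold."),
        ("A", "Agreed, and we should keep the button layout simple."),
    ]
    print(summarize_dialogue(meeting))
```

Because the summarizer is used zero-shot, no dialogue-specific training data is needed; the restructuring step is what makes the conversational input look enough like a document for the model to handle.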

Authors (2)
  1. Prakhar Ganesh (15 papers)
  2. Saket Dingliwal (22 papers)
Citations (11)
