
QTSumm: Query-Focused Summarization over Tabular Data (2305.14303v2)

Published 23 May 2023 in cs.CL

Abstract: People primarily consult tables to conduct data analysis or answer specific questions. Text generation systems that can provide accurate table summaries tailored to users' information needs can facilitate more efficient access to relevant data insights. Motivated by this, we define a new query-focused table summarization task, where text generation models have to perform human-like reasoning and analysis over the given table to generate a tailored summary. We introduce a new benchmark named QTSumm for this task, which contains 7,111 human-annotated query-summary pairs over 2,934 tables covering diverse topics. We investigate a set of strong baselines on QTSumm, including text generation, table-to-text generation, and LLMs. Experimental results and manual analysis reveal that the new task presents significant challenges in table-to-text generation for future research. Moreover, we propose a new approach named ReFactor that retrieves and reasons over query-relevant information from tabular data to generate several natural language facts. Experimental results demonstrate that ReFactor improves over the baselines by concatenating the generated facts to the model input. Our data and code are publicly available at https://github.com/yale-nlp/QTSumm.
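
The abstract describes ReFactor only at a high level: retrieve query-relevant information from the table, verbalize it as natural language facts, and concatenate those facts to the model input. The sketch below is a rough, hypothetical illustration of that pipeline, not the released implementation; the keyword-overlap retriever, the "column is value" fact template, and the "query: ... facts: ... table: ..." serialization are all assumptions made for the example (the actual code is in the linked repository).

```python
# Hypothetical sketch of the ReFactor idea: retrieve query-relevant rows,
# verbalize them as facts, and prepend them to a linearized table input.
# Retrieval heuristic, fact template, and input format are illustrative only.

from typing import Dict, List


def row_to_fact(header: List[str], row: List[str]) -> str:
    """Verbalize one table row as a simple 'column is value' fact."""
    return "; ".join(f"{col} is {val}" for col, val in zip(header, row)) + "."


def retrieve_facts(query: str, table: Dict, top_k: int = 3) -> List[str]:
    """Score rows by keyword overlap with the query and keep the top-k."""
    query_terms = set(query.lower().split())
    scored = []
    for row in table["rows"]:
        overlap = sum(1 for cell in row if set(cell.lower().split()) & query_terms)
        scored.append((overlap, row))
    scored.sort(key=lambda x: x[0], reverse=True)
    return [row_to_fact(table["header"], row) for _, row in scored[:top_k]]


def build_model_input(query: str, table: Dict) -> str:
    """Concatenate query, retrieved facts, and a linearized table."""
    facts = retrieve_facts(query, table)
    linearized = " | ".join(
        " , ".join(row) for row in [table["header"]] + table["rows"]
    )
    return f"query: {query} facts: {' '.join(facts)} table: {linearized}"


if __name__ == "__main__":
    table = {
        "header": ["Player", "Team", "Points"],
        "rows": [
            ["Alice", "Hawks", "31"],
            ["Bob", "Hawks", "12"],
            ["Cara", "Owls", "27"],
        ],
    }
    print(build_model_input("Which Hawks player scored the most points?", table))
```

The resulting string would then be fed to a text-to-text summarization model; the paper's fine-tuned baselines and LLM prompts presumably use their own table serialization and fact-generation strategy.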

Authors (12)
  1. Yilun Zhao (59 papers)
  2. Zhenting Qi (19 papers)
  3. Linyong Nan (17 papers)
  4. Boyu Mi (5 papers)
  5. Yixin Liu (108 papers)
  6. Weijin Zou (4 papers)
  7. Simeng Han (20 papers)
  8. Ruizhe Chen (32 papers)
  9. Xiangru Tang (62 papers)
  10. Yumo Xu (14 papers)
  11. Dragomir Radev (98 papers)
  12. Arman Cohan (121 papers)