Few-Shot Table-to-Text Generation with Prompt Planning and Knowledge Memorization (2302.04415v3)

Published 9 Feb 2023 in cs.CL and cs.AI

Abstract: Pre-trained language models (PLMs) have achieved remarkable advances in table-to-text generation tasks. However, the lack of labeled domain-specific knowledge and the topology gap between tabular data and text make it difficult for PLMs to yield faithful text. Low-resource generation likewise faces unique challenges in this domain. Inspired by how humans describe tabular data with prior knowledge, we propose a new framework, PromptMize, which targets table-to-text generation under few-shot settings. The design of our framework consists of two aspects: a prompt planner and a knowledge adapter. The prompt planner generates a prompt signal that provides instance guidance for PLMs to bridge the topology gap between tabular data and text. The knowledge adapter memorizes domain-specific knowledge from an unlabelled corpus to supply essential information during generation. We conduct extensive experiments and analyses on three open-domain few-shot NLG datasets: human, song, and book. Compared with previous state-of-the-art approaches, our model achieves remarkable generation quality as judged by both human and automatic evaluations.
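The abstract describes a two-component design: a prompt planner that turns a table into an instance-level prompt signal, and a knowledge adapter that retrieves domain knowledge memorized from an unlabeled corpus, both feeding a pre-trained model. The paper's actual implementation is not reproduced here; the sketch below is a hypothetical, minimal illustration of how a pipeline of this shape could be wired together. Every name in it (`PromptPlanner`, `KnowledgeAdapter`, `linearize_table`, the lexical-overlap retrieval, the `[SEP]` joining) is an assumption made for illustration, and in the paper these components would be learned and would condition a seq2seq PLM rather than merely build a string.

```python
# Hypothetical sketch of a PromptMize-style pipeline as described in the
# abstract: prompt planner + knowledge adapter preparing input for a PLM.
# All names and interfaces are illustrative assumptions, not the paper's code.

from dataclasses import dataclass


@dataclass
class TableCell:
    attribute: str
    value: str


def linearize_table(cells: list[TableCell]) -> str:
    """Flatten a table into an 'attribute : value' sequence for the PLM."""
    return " ; ".join(f"{c.attribute} : {c.value}" for c in cells)


class PromptPlanner:
    """Produces an instance-level prompt signal. Here a toy heuristic
    (verbalize non-empty attributes in table order) stands in for the
    learned planner that bridges the table-text topology gap."""

    def plan(self, cells: list[TableCell]) -> str:
        ordered = [c.attribute for c in cells if c.value]
        return "describe: " + " -> ".join(ordered)


class KnowledgeAdapter:
    """Memorizes domain snippets from an unlabeled corpus and retrieves
    the ones relevant to the current table (toy lexical matching here)."""

    def __init__(self, corpus: list[str]):
        self.corpus = corpus

    def retrieve(self, cells: list[TableCell], k: int = 2) -> list[str]:
        values = {c.value.lower() for c in cells}
        # Score each snippet by how many table values it mentions.
        scored = [(sum(v in s.lower() for v in values), s) for s in self.corpus]
        scored.sort(key=lambda t: t[0], reverse=True)
        return [s for score, s in scored[:k] if score > 0]


def build_model_input(cells, planner, adapter) -> str:
    """Concatenate prompt signal, retrieved knowledge, and the linearized
    table; this string would be fed to a seq2seq PLM for decoding."""
    parts = [
        planner.plan(cells),
        " | ".join(adapter.retrieve(cells)),
        linearize_table(cells),
    ]
    return " [SEP] ".join(p for p in parts if p)


if __name__ == "__main__":
    table = [
        TableCell("name", "Jacques Brel"),
        TableCell("occupation", "singer"),
        TableCell("birthplace", "Schaerbeek"),
    ]
    adapter = KnowledgeAdapter(
        ["Jacques Brel was a Belgian singer.", "Schaerbeek is in Brussels."]
    )
    print(build_model_input(table, PromptPlanner(), adapter))
```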

Authors (8)
  1. Zhixin Guo (8 papers)
  2. Minyxuan Yan (2 papers)
  3. Jiexing Qi (9 papers)
  4. Jianping Zhou (9 papers)
  5. Ziwei He (13 papers)
  6. Zhouhan Lin (57 papers)
  7. Guanjie Zheng (37 papers)
  8. Xinbing Wang (98 papers)
Citations (3)
