
Few-shot Knowledge Graph-to-Text Generation with Pretrained Language Models (2106.01623v1)

Published 3 Jun 2021 in cs.CL

Abstract: This paper studies how to automatically generate a natural language text that describes the facts in a knowledge graph (KG). Considering the few-shot setting, we leverage the excellent capacities of pretrained language models (PLMs) in language understanding and generation. We make three major technical contributions, namely representation alignment for bridging the semantic gap between KG encodings and PLMs, relation-biased KG linearization for deriving better input representations, and multi-task learning for learning the correspondence between KG and text. Extensive experiments on three benchmark datasets have demonstrated the effectiveness of our model on the KG-to-text generation task. In particular, our model outperforms all comparison methods in both fully-supervised and few-shot settings. Our code and datasets are available at https://github.com/RUCAIBox/Few-Shot-KG2Text.
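To make the KG linearization idea concrete, here is a minimal Python sketch of flattening KG triples into a single sequence a PLM can consume. The `[H]`/`[R]`/`[T]` special tokens, the group-by-relation heuristic standing in for "relation-biased" ordering, and the example triples are all illustrative assumptions, not the authors' exact scheme; see the linked repository for the actual implementation.

```python
# Hypothetical sketch of relation-biased KG linearization for PLM input.
# Special tokens and ordering heuristic are assumptions for illustration.

from typing import List, Tuple

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)

def linearize_kg(triples: List[Triple]) -> str:
    """Flatten KG triples into one text sequence.

    Each triple becomes "[H] head [R] relation [T] tail". Sorting by
    relation keeps facts that share a relation adjacent, a simple
    stand-in for a relation-biased ordering.
    """
    triples = sorted(triples, key=lambda t: t[1])  # group by relation
    parts = [f"[H] {h} [R] {r} [T] {t}" for h, r, t in triples]
    return " ".join(parts)

if __name__ == "__main__":
    kg = [
        ("Jens_Härtel", "club", "Berliner_FC_Dynamo"),
        ("Jens_Härtel", "birthPlace", "Germany"),
    ]
    print(linearize_kg(kg))
    # [H] Jens_Härtel [R] birthPlace [T] Germany [H] Jens_Härtel [R] club [T] Berliner_FC_Dynamo
```

In practice the resulting string would be tokenized and fed to the PLM encoder, with the paper's representation alignment and multi-task objectives handling the gap between graph structure and text.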

Authors (6)
  1. Junyi Li (92 papers)
  2. Tianyi Tang (30 papers)
  3. Wayne Xin Zhao (196 papers)
  4. Zhicheng Wei (5 papers)
  5. Nicholas Jing Yuan (22 papers)
  6. Ji-Rong Wen (299 papers)
Citations (43)