Task-specific Pre-training and Prompt Decomposition for Knowledge Graph Population with Language Models (2208.12539v2)

Published 26 Aug 2022 in cs.CL

Abstract: We present a system for knowledge graph population with language models (LMs), evaluated on the Knowledge Base Construction from Pre-trained Language Models (LM-KBC) challenge at ISWC 2022. Our system involves task-specific pre-training to improve LM representation of the masked object tokens, prompt decomposition for progressive generation of candidate objects, and other methods for higher-quality retrieval. Our system is the winner of Track 1 of the LM-KBC challenge; based on the BERT LM, it achieves a 55.0% F1 score on the hidden test set of the challenge.
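The abstract describes probing a masked LM for object entities of a knowledge graph triple. Below is a minimal sketch of that cloze-style setup using Hugging Face transformers; the checkpoint, prompt wording, and top-k cutoff are illustrative assumptions, and the paper's task-specific pre-training and prompt decomposition are not reproduced here.

```python
# Illustrative cloze-style probe of a masked LM for KB population.
# A minimal sketch, not the paper's pipeline: the checkpoint, the
# prompt template, and k=5 are assumptions for demonstration.
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

# Query a (subject, relation, ?) triple as a cloze prompt.
prompt = "France shares a border with [MASK]."
inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits

# Rank vocabulary tokens for the masked object slot; the paper's
# task-specific pre-training targets the quality of exactly these
# masked-object representations.
top_k = torch.topk(logits[0, mask_pos], k=5)
candidates = tokenizer.convert_ids_to_tokens(top_k.indices.tolist())
print(candidates)
```

In the paper's prompt decomposition, such a query would instead be split into progressive sub-prompts to generate candidate objects step by step; the single-mask probe above only shows the baseline retrieval setup it improves on.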

Authors (5)
  1. Tianyi Li (84 papers)
  2. Wenyu Huang (7 papers)
  3. Nikos Papasarantopoulos (4 papers)
  4. Pavlos Vougiouklis (11 papers)
  5. Jeff Z. Pan (78 papers)
Citations (13)
