
zrLLM: Zero-Shot Relational Learning on Temporal Knowledge Graphs with Large Language Models (2311.10112v2)

Published 15 Nov 2023 in cs.AI, cs.CL, and cs.LG

Abstract: Modeling evolving knowledge over temporal knowledge graphs (TKGs) has attracted increasing attention, and various methods have been proposed to forecast links on TKGs. Most of them are embedding-based, learning hidden representations of knowledge graph (KG) entities and relations from the observed graph contexts. Although these methods show strong performance on traditional TKG forecasting (TKGF) benchmarks, they struggle to model unseen zero-shot relations that have no prior graph context. In this paper, we mitigate this problem as follows. We first feed the text descriptions of KG relations into LLMs to generate relation representations, and then introduce these representations into embedding-based TKGF methods. The LLM-empowered representations capture the semantic information in the relation descriptions, so relations with similar semantic meanings, whether seen or unseen, stay close in the embedding space, enabling TKGF models to recognize zero-shot relations even without any observed graph context. Experimental results show that our approach helps TKGF models achieve much better performance in forecasting facts with previously unseen relations, while maintaining their link forecasting ability for seen relations.
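
The core idea lends itself to a short sketch. The following is a minimal illustration, not the authors' code: a pretrained sentence encoder (sentence-transformers here, standing in for the LLM encoder the paper uses) embeds relation descriptions, and a learned linear projection maps them into the relation-embedding space of an embedding-based TKGF model. The relation names, the example descriptions, and the 128-dimensional projection are all hypothetical.

```python
# Sketch (not the authors' code): embed KG relation descriptions with a
# pretrained text encoder, then project them into the relation-embedding
# space that an embedding-based TKGF model would consume.
import torch
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for the paper's LLM

# Hypothetical text descriptions; "negotiate_with" plays the zero-shot
# relation that never appears in the training graph.
relation_texts = {
    "make_a_visit":   "One actor pays an official visit to another actor.",
    "host_a_visit":   "One actor hosts an official visit by another actor.",
    "negotiate_with": "One actor holds negotiations with another actor.",
}
names = list(relation_texts)
with torch.no_grad():
    sem = torch.tensor(encoder.encode([relation_texts[n] for n in names]))

# Learned projection into the TKGF model's relation space (trained jointly
# with the forecasting model in practice; randomly initialized here).
proj = torch.nn.Linear(sem.size(-1), 128)
rel_emb = proj(sem)  # rows usable wherever the model expects relation embeddings

# Relations with similar descriptions stay close in the embedding space,
# so the zero-shot relation inherits a meaningful representation.
sims = torch.nn.functional.cosine_similarity(
    rel_emb.unsqueeze(1), rel_emb.unsqueeze(0), dim=-1
)
print({(names[i], names[j]): round(sims[i, j].item(), 3)
       for i in range(len(names)) for j in range(i + 1, len(names))})
```

Because the representation comes from the text alone, the unseen relation gets a usable embedding before it ever appears in the graph, which is the property that lets the downstream forecasting model handle zero-shot relations.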

Authors (7)
  1. Zifeng Ding (26 papers)
  2. Heling Cai (1 paper)
  3. Jingpei Wu (5 papers)
  4. Yunpu Ma (57 papers)
  5. Ruotong Liao (8 papers)
  6. Bo Xiong (84 papers)
  7. Volker Tresp (158 papers)
Citations (6)