
Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion (2305.07912v2)

Published 13 May 2023 in cs.CL and cs.AI

Abstract: Temporal knowledge graph completion (TKGC) is a crucial task that involves reasoning at known timestamps to complete missing parts of facts, and it has attracted increasing attention in recent years. Most existing methods focus on learning representations based on graph neural networks while inaccurately extracting information from timestamps and insufficiently utilizing the information implied in relations. To address these problems, we propose a novel TKGC model, namely Pre-trained Language Model with Prompts for TKGC (PPT). We convert a series of sampled quadruples into pre-trained language model inputs and convert the intervals between timestamps into different prompts, forming coherent sentences with implicit semantic information. We train our model with a masking strategy that converts the TKGC task into a masked token prediction task, which can leverage the semantic information in pre-trained language models. Experiments on three benchmark datasets and extensive analysis demonstrate that our model is highly competitive with other models on four metrics. Our model can effectively incorporate information from temporal knowledge graphs into language models.
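The abstract only sketches the input construction, so the following is a minimal illustrative sketch of the idea: verbalize a time-ordered sample of quadruples, insert interval-dependent prompts between consecutive facts, and ask a masked language model to predict the missing entity. The helper names (`interval_prompt`, `quadruples_to_sentence`), the prompt wordings, the example facts, and the use of an off-the-shelf `bert-base-uncased` fill-mask pipeline are all assumptions made for illustration; they are not the authors' released implementation, which fine-tunes the model with its own masking strategy.

```python
# Illustrative sketch of PPT-style input construction (assumptions noted
# in the surrounding text): quadruples are verbalized into one sentence,
# with timestamp intervals mapped to natural-language prompts, and the
# query entity replaced by [MASK] for masked token prediction.
from transformers import pipeline

def interval_prompt(delta_days: int) -> str:
    """Map the gap between consecutive timestamps to a prompt phrase,
    so the interval carries implicit semantic information.
    The wordings and thresholds here are illustrative choices."""
    if delta_days == 0:
        return "at the same time,"
    if delta_days <= 30:
        return "shortly after,"
    if delta_days <= 365:
        return "months later,"
    return "years later,"

def quadruples_to_sentence(quads):
    """Verbalize a time-ordered list of (head, relation, tail, day)
    quadruples into one coherent sentence with interval prompts."""
    parts = []
    prev_day = None
    for head, relation, tail, day in quads:
        if prev_day is not None:
            parts.append(interval_prompt(day - prev_day))
        parts.append(f"{head} {relation} {tail}.")
        prev_day = day
    return " ".join(parts)

# Hypothetical example: complete (Germany, negotiated with, ?, day 120)
# given an earlier fact about the same head entity.
history = [
    ("Germany", "signed a treaty with", "France", 30),
    ("Germany", "negotiated with", "[MASK]", 120),
]
text = quadruples_to_sentence(history)

fill = pipeline("fill-mask", model="bert-base-uncased")
for candidate in fill(text, top_k=3):
    print(candidate["token_str"], candidate["score"])
```

In the paper's setting, the masked position would be scored against the temporal knowledge graph's entity set rather than BERT's full word-piece vocabulary; this sketch simply prints the pipeline's top candidate tokens to show the masked-prediction mechanism.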

Authors (5)
  1. Wenjie Xu (29 papers)
  2. Ben Liu (17 papers)
  3. Miao Peng (6 papers)
  4. Xu Jia (57 papers)
  5. Min Peng (32 papers)
Citations (12)
