Learning Sequence Encoders for Temporal Knowledge Graph Completion (1809.03202v1)

Published 10 Sep 2018 in cs.AI and cs.CL

Abstract: Research on link prediction in knowledge graphs has mainly focused on static multi-relational data. In this work we consider temporal knowledge graphs where relations between entities may only hold for a time interval or a specific point in time. In line with previous work on static knowledge graphs, we propose to address this problem by learning latent entity and relation type representations. To incorporate temporal information, we utilize recurrent neural networks to learn time-aware representations of relation types which can be used in conjunction with existing latent factorization methods. The proposed approach is shown to be robust to common challenges in real-world KGs: the sparsity and heterogeneity of temporal expressions. Experiments show the benefits of our approach on four temporal KGs. The data sets are available under a permissive BSD-3 license.

Citations (357)

Summary

  • The paper introduces RNN-based sequence encoders using LSTMs to integrate time-specific data into knowledge graph predictions.
  • It develops TA-TransE and TA-DistMult models that significantly outperform non-temporal baselines on MRR and Hits@N.
  • The approach generalizes to unseen timestamps, addressing a limitation of methods that learn a separate embedding per timestamp.

Learning Sequence Encoders for Temporal Knowledge Graph Completion

The paper "Learning Sequence Encoders for Temporal Knowledge Graph Completion" presents a novel approach to address the challenges inherent in link prediction within temporal knowledge graphs (KGs). Temporal KGs, unlike their static counterparts, involve entity relationships that may be valid only within certain time intervals or at specific timestamps. Traditional approaches often neglect the temporal aspects, limiting their applicability to dynamic datasets. This paper introduces the use of sequence encoders to incorporate temporal information effectively, enhancing the predictive accuracy in temporal KGs.

Methodology

The authors propose a model that leverages recurrent neural networks (RNNs), specifically Long Short-Term Memory networks (LSTMs), to learn time-aware representations from sequences of temporal tokens. Each dated fact is turned into such a sequence: the relation type is followed by tokens derived from the timestamp (individual year digits, the month, and the day) and from temporal modifiers such as "since" and "until". The model combines these sequence-encoded temporal representations with standard entity embeddings, integrating them into established latent factorization frameworks.
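
As a concrete illustration, below is a minimal sketch of this tokenization and encoding scheme in PyTorch. The function and class names are ours, and details such as the exact token suffixes are one plausible reading of the paper's description.

```python
import torch
import torch.nn as nn

def temporal_tokens(relation, timestamp, modifier=None):
    """Decompose a dated predicate into a token sequence, e.g.
    ('playsFor', '2014-03-08') ->
    ['playsFor', '2y', '0y', '1y', '4y', '03m', '08d']."""
    year, month, day = timestamp.split("-")
    tokens = [relation] + ([modifier] if modifier else [])
    tokens += [digit + "y" for digit in year]   # one token per year digit
    tokens += [month + "m", day + "d"]          # month and day tokens
    return tokens

class TimeAwareRelationEncoder(nn.Module):
    """LSTM over the token sequence; the final hidden state serves as
    the time-aware relation representation."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        _, (h_n, _) = self.lstm(self.embed(token_ids))
        return h_n[-1]                       # (batch, dim)
```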

Two primary instantiations of this approach are introduced: TA-TransE and TA-DistMult. In TA-TransE, the time-aware relation representation produced by the sequence encoder replaces the static relation vector in TransE's translation-based scoring function; TA-DistMult extends DistMult's bilinear scoring function in the same way.
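
Under those definitions, the two scoring functions can be sketched as follows. This is a simplified view, not the authors' code: p_t denotes the sequence-encoded relation vector, and e_s, e_o are subject and object entity embeddings.

```python
import torch

def ta_transe_score(e_s, p_t, e_o, p=1):
    # Translation-based score: the subject plus the time-aware relation
    # should land near the object (higher score is better here).
    return -torch.norm(e_s + p_t - e_o, p=p, dim=-1)

def ta_distmult_score(e_s, p_t, e_o):
    # Bilinear score: elementwise product of the three vectors, summed.
    return (e_s * p_t * e_o).sum(dim=-1)
```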

Experimental Results

The research evaluates the proposed methods on four temporal KG datasets: YAGO15K, ICEWS 2014, ICEWS 2005-15, and Wikidata. These datasets differ in the density and heterogeneity of their temporal annotations, providing a broad test of the approach.

The results demonstrate consistent improvements over non-temporal baseline models, including TransE and DistMult, indicating the effectiveness of temporal awareness. Specifically, TA-TransE and TA-DistMult show significant gains in Mean Reciprocal Rank (MRR) and Hits@N. The improvements are most pronounced on datasets with rich temporal data, where models such as TTransE falter because they learn a separate embedding per timestamp and therefore cannot handle timestamps unseen during training.
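
For reference, both metrics are computed from the rank the model assigns to the true entity for each test fact; a minimal sketch:

```python
import torch

def mrr(ranks):
    # Mean Reciprocal Rank: average of 1/rank over all test facts.
    return (1.0 / ranks.float()).mean().item()

def hits_at_n(ranks, n):
    # Fraction of test facts whose true entity is ranked in the top n.
    return (ranks <= n).float().mean().item()

# Example: ranks of the true entity across five test queries.
ranks = torch.tensor([1, 3, 2, 10, 1])
print(mrr(ranks), hits_at_n(ranks, 3))  # ~0.587, 0.8
```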

Implications and Future Directions

The paper's contributions hold substantial implications for the field of temporal KG completion. By employing token-based encoding of temporal information, the methodology enables efficient parameter sharing across timestamps and supports generalization to unseen ones, an advantage over conventional models that store a separate embedding for every timestamp observed in training.
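
To make the generalization argument concrete: with the token decomposition sketched earlier, a date never observed during training still maps onto tokens that were, assuming each digit, month, and day token appeared somewhere in the training data. The relation name below is hypothetical.

```python
# Reusing temporal_tokens from the earlier sketch (illustrative only).
tokens = temporal_tokens("worksFor", "1999-12-31", modifier="since")
# -> ['worksFor', 'since', '1y', '9y', '9y', '9y', '12m', '31d']
# Each token has its own learned embedding, so the LSTM can encode this
# exact date even if it never appeared in the training set.
```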

The research opens several avenues for future work. Further exploration may involve refining the sequence encoding architecture or integrating more advanced RNN variants. Additionally, expanding the application to real-world, large-scale temporal datasets could validate the model's scalability and adaptability in a broader array of contexts.

Conclusion

This paper provides a valuable step forward in enhancing temporal knowledge representation. By embedding temporal dynamics into KGs, the research offers a methodology that mitigates the sparsity and heterogeneity challenges of real-world temporal datasets. The findings present a compelling case for the integration of temporal encodings within KG completion tasks, setting a foundation for subsequent innovations in the domain.
