
Learning from History: Modeling Temporal Knowledge Graphs with Sequential Copy-Generation Networks (2012.08492v2)

Published 15 Dec 2020 in cs.AI, cs.CL, and cs.LG

Abstract: Large knowledge graphs often grow to store temporal facts that model the dynamic relations or interactions of entities along the timeline. Since such temporal knowledge graphs often suffer from incompleteness, it is important to develop time-aware representation learning models that help to infer the missing temporal facts. While the temporal facts are typically evolving, it is observed that many facts often show a repeated pattern along the timeline, such as economic crises and diplomatic activities. This observation indicates that a model could potentially learn much from the known facts that appeared in history. To this end, we propose a new representation learning model for temporal knowledge graphs, namely CyGNet, based on a novel time-aware copy-generation mechanism. CyGNet is not only able to predict future facts from the whole entity vocabulary, but also capable of identifying facts with repetition and accordingly predicting such future facts with reference to the known facts in the past. We evaluate the proposed method on the knowledge graph completion task using five benchmark datasets. Extensive experiments demonstrate the effectiveness of CyGNet for predicting future facts with repetition as well as de novo fact prediction.

Authors (5)
  1. Cunchao Zhu (1 paper)
  2. Muhao Chen (159 papers)
  3. Changjun Fan (15 papers)
  4. Guangquan Cheng (3 papers)
  5. Yan Zhan (1 paper)
Citations (209)

Summary

Overview of "Learning from History: Modeling Temporal Knowledge Graphs with Sequential Copy-Generation Networks"

This paper introduces CyGNet, a novel representation learning model for Temporal Knowledge Graphs (TKGs). The work addresses the inherent challenge of incompleteness in TKGs, where temporal facts are often missing. TKGs are graphs where each edge represents a factual relationship between entities with timestamps, thus capturing dynamic relations over time. These graphs find applications in diverse domains such as information retrieval, natural language understanding, and social network analysis. Notably, the paper underscores the repetitive nature of many temporal facts—for instance, recurring economic crises or annual events—which can be leveraged for more accurate graph completion.

Key Contributions

  1. Sequential Copy-Generation Networks: CyGNet integrates a copy-generation mechanism into TKG modeling, inspired by methods employed in abstractive summarization. It operates in two modes:
    • Copy Mode: Scores candidate entities drawn from the historical vocabulary of a query, i.e., entities that have already appeared with the same subject and relation in the past, thereby exploiting recurrence.
    • Generation Mode: Predicts facts over the entire entity vocabulary, including those with no historical precedent.
  2. Empirical Validation: The model is benchmarked against existing temporal models on the ICEWS14, ICEWS18, GDELT, WIKI, and YAGO datasets. CyGNet demonstrates superior performance, most notably on GDELT, where it surpasses the previous state of the art by over 10% in MRR (Mean Reciprocal Rank).
  3. Ablation Studies: An ablation study confirms that both inference modes contribute significantly to the model’s success, with the Copy Mode improving predictions by exploiting detected recurrence patterns in the historical data.
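The two modes above can be sketched as combining two distributions over the entity vocabulary: the Copy Mode restricts scores to entities with historical support for the query, the Generation Mode scores all entities, and a mixing weight balances the two. This is a minimal numpy illustration of that idea; the function name, the use of plain logit vectors, and the fixed `alpha` weight are simplifying assumptions, not the paper's exact parameterization.

```python
import numpy as np

def copy_generation_scores(gen_logits, copy_logits, history_mask, alpha=0.7):
    """Mix Copy-mode and Generation-mode distributions, CyGNet-style (sketch).

    gen_logits   : (V,) scores over the whole entity vocabulary (Generation mode)
    copy_logits  : (V,) scores over the same vocabulary (Copy mode)
    history_mask : (V,) bool, True where the entity appeared with the query's
                   subject and relation at an earlier timestamp
    alpha        : weight given to the Copy mode (assumed fixed here)
    """
    def softmax(x):
        x = x - x.max()          # stabilize before exponentiating
        e = np.exp(x)
        return e / e.sum()

    # Generation mode: a distribution over every entity in the vocabulary.
    p_gen = softmax(gen_logits)

    # Copy mode: suppress entities with no historical support, then normalize,
    # so probability mass concentrates on previously observed entities.
    masked = np.where(history_mask, copy_logits, -1e9)
    p_copy = softmax(masked)

    # Final prediction mixes the two modes.
    return alpha * p_copy + (1.0 - alpha) * p_gen

# Toy query over 4 entities, where only entity 2 has appeared historically:
gen = np.array([0.1, 0.2, 0.0, 0.3])
copy = np.array([0.0, 0.0, 2.0, 0.0])
mask = np.array([False, False, True, False])
p = copy_generation_scores(gen, copy, mask, alpha=0.5)
```

With any nontrivial Copy weight, the historically observed entity is boosted relative to a pure Generation-mode prediction, which is precisely the repetition effect the ablation study attributes to the Copy Mode.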

Implications

The primary contribution of CyGNet lies in its ability to improve temporal knowledge graph completion by effectively capturing recurrent patterns. By balancing copying from historical data against generating new predictions, CyGNet offers a nuanced approach that mitigates a limitation of previous models, which lacked mechanisms for handling repetition.

Practically, CyGNet’s approach could be immediately impactful in domains that rely heavily on temporal knowledge bases, such as real-time event monitoring and predictive analytics in economic forecasting. Theoretically, this paper enriches the understanding of temporal dynamics in knowledge graphs, potentially guiding future research towards models adept at discerning and utilizing temporal patterns.

Speculative Future Directions

Future research could investigate the robustness of the copy-generation mechanism in other contexts of graph neural networks, particularly for tasks that involve multigraph settings or dynamic graph evolution. Additionally, further exploration into learning strategies that better balance the contributions of historical recurrence versus generative capabilities could yield even more resilient models. Extending this work to include other modalities, such as attributed temporal graphs, would also be an intriguing avenue to explore. Such directions could be significantly beneficial in refining models that aim to unify temporal reasoning with broader, multi-dimensional knowledge representation challenges.