
Temporal Graph Networks for Deep Learning on Dynamic Graphs (2006.10637v3)

Published 18 Jun 2020 in cs.LG and stat.ML

Abstract: Graph Neural Networks (GNNs) have recently become increasingly popular due to their ability to learn complex systems of relations or interactions arising in a broad spectrum of problems ranging from biology and particle physics to social networks and recommendation systems. Despite the plethora of different models for deep learning on graphs, few approaches have been proposed thus far for dealing with graphs that present some sort of dynamic nature (e.g. evolving features or connectivity over time). In this paper, we present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events. Thanks to a novel combination of memory modules and graph-based operators, TGNs are able to significantly outperform previous approaches being at the same time more computationally efficient. We furthermore show that several previous models for learning on dynamic graphs can be cast as specific instances of our framework. We perform a detailed ablation study of different components of our framework and devise the best configuration that achieves state-of-the-art performance on several transductive and inductive prediction tasks for dynamic graphs.

Authors (6)
  1. Emanuele Rossi (20 papers)
  2. Ben Chamberlain (3 papers)
  3. Fabrizio Frasca (20 papers)
  4. Davide Eynard (9 papers)
  5. Federico Monti (16 papers)
  6. Michael Bronstein (77 papers)
Citations (534)

Summary

Temporal Graph Networks for Deep Learning on Dynamic Graphs

The paper "Temporal Graph Networks for Deep Learning on Dynamic Graphs" by Rossi et al. presents a novel approach to the increasingly relevant problem of learning on dynamic graphs, which are characterized by evolving features and connectivity over time. The authors introduce Temporal Graph Networks (TGNs), an adaptable framework designed to efficiently process continuous-time dynamic graphs represented as sequences of time-stamped events.

Core Contributions

The primary contribution of this paper is the TGN framework itself, which addresses a limitation of existing Graph Neural Networks (GNNs): they are typically designed for static graphs. TGNs equip each node with a memory module that records long-term dependencies in the node's interaction history, a capability largely absent from previous models. This memory is crucial for dynamic graphs, where nodes and edges can appear and evolve continuously and unpredictably.
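
To make the event-stream representation concrete, the following is a minimal sketch, with illustrative names rather than the authors' code, of a dynamic graph as a sequence of timestamped interactions plus a per-node memory table:

```python
from dataclasses import dataclass, field

import torch


@dataclass
class InteractionEvent:
    src: int            # source node id
    dst: int            # destination node id
    t: float            # event timestamp
    feat: torch.Tensor  # edge features for this interaction


@dataclass
class NodeMemory:
    dim: int
    state: dict = field(default_factory=dict)        # node id -> memory vector
    last_update: dict = field(default_factory=dict)  # node id -> last event time

    def get(self, node: int) -> torch.Tensor:
        # Unseen nodes start from a zero memory, which is what lets the
        # model produce embeddings for nodes never observed during training.
        return self.state.get(node, torch.zeros(self.dim))
```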

The authors also show that several existing models, such as Jodie, DyRep, and TGAT, can be cast as particular instances of the TGN framework, demonstrating its generality. Moreover, they introduce a training strategy that processes events in time-ordered batches, improving computational efficiency while ensuring that predictions never depend on information from the future.

Methodology

TGNs utilize a combination of memory modules, message functions, and various temporal embedding techniques to generate node embeddings. These embeddings account for the continually evolving nature of graph interactions:

  • Memory Module: Each node in a graph is endowed with a memory vector, capturing its historical context. This memory is crucial for storing long-term dependencies and is updated through interaction events.
  • Temporal Embedding: To counteract memory staleness, the framework computes embeddings with graph-based operators, most notably temporal graph attention (alongside a simpler temporal graph sum), that aggregate information from a node's temporal neighbors. A node's embedding therefore reflects its neighbors' recent activity even when its own memory has not been updated for some time, which also gives the model greater expressive power.
  • Training Strategy: Events are processed in time-ordered batches, and a batch's interactions are written into memory only after they have been predicted. This keeps training efficient and scalable while preserving the sequential integrity of the data (see the sketches after this list).
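
A condensed sketch of how these components interact is given below. It reuses the data structures from the earlier snippet; the dimensions, the GRU memory updater, and the two-head attention are illustrative choices (the paper ablates several alternatives), not a reproduction of the authors' code:

```python
import torch
import torch.nn as nn

MEM_DIM, EDGE_DIM, TIME_DIM = 100, 16, 100
EMB_DIM = MEM_DIM + TIME_DIM  # width of attention queries/keys


class TimeEncode(nn.Module):
    """Learnable cosine encoding of a time delta (in the style of TGAT)."""
    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Linear(1, dim)

    def forward(self, dt: torch.Tensor) -> torch.Tensor:  # dt: shape (1,)
        return torch.cos(self.w(dt))


time_enc = TimeEncode(TIME_DIM)
msg_mlp = nn.Linear(2 * MEM_DIM + EDGE_DIM + TIME_DIM, MEM_DIM)  # message function
memory_updater = nn.GRUCell(MEM_DIM, MEM_DIM)                    # memory update
attn = nn.MultiheadAttention(EMB_DIM, num_heads=2, batch_first=True)


def process_event(mem: "NodeMemory", ev: "InteractionEvent") -> None:
    """Update both endpoint memories after an interaction event."""
    # Both messages are computed from the pre-event memories, so neither
    # endpoint conditions on the other's post-event state.
    s_src, s_dst = mem.get(ev.src), mem.get(ev.dst)
    for node, own, other in ((ev.src, s_src, s_dst), (ev.dst, s_dst, s_src)):
        dt = torch.tensor([ev.t - mem.last_update.get(node, 0.0)])
        msg = torch.relu(msg_mlp(torch.cat([own, other, ev.feat, time_enc(dt)])))
        mem.state[node] = memory_updater(msg[None], own[None])[0]
        mem.last_update[node] = ev.t


def embed(mem: "NodeMemory", node: int, neighbors: list, t: float) -> torch.Tensor:
    """Temporal graph attention over a node's recent neighbors.

    `neighbors` holds (neighbor_id, event_time) pairs. Attending over them
    keeps the embedding fresh even when the node's own memory is stale.
    """
    q = torch.cat([mem.get(node), time_enc(torch.tensor([0.0]))]).view(1, 1, EMB_DIM)
    kv = torch.stack([
        torch.cat([mem.get(n), time_enc(torch.tensor([t - te]))])
        for n, te in neighbors
    ]).unsqueeze(0)  # shape (1, num_neighbors, EMB_DIM)
    z, _ = attn(q, kv, kv)
    return z.view(EMB_DIM)  # time-aware node embedding
```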
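
The batch loop below sketches the training strategy under the same assumptions; `predict_links` and `batches` are hypothetical helpers, and negative sampling is omitted for brevity. The key point is that each batch is predicted using memory built only from earlier batches, so the model is never asked to predict an interaction it has already consumed:

```python
def train_epoch(batches, predict_links, mem, loss_fn, opt):
    """One epoch over time-ordered batches (illustrative sketch)."""
    pending = []  # events seen but not yet written into memory
    for batch in batches:  # batches are sorted by timestamp
        opt.zero_grad()
        # Flush the previous batch's events into memory first, so gradients
        # can reach the message and memory modules through this batch's loss.
        for ev in pending:
            process_event(mem, ev)
        pending = list(batch)
        scores = predict_links(mem, batch)  # memory excludes current batch
        loss = loss_fn(scores, torch.ones(len(batch)))  # negatives omitted
        loss.backward()
        opt.step()
        # Truncate backpropagation at the batch boundary.
        mem.state = {k: v.detach() for k, v in mem.state.items()}
```

The flush-then-predict ordering is what lets the memory-related modules receive a gradient signal while still respecting the temporal order of events.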

Experimental Results

TGNs achieve state-of-the-art performance in both transductive and inductive settings on datasets including Wikipedia, Reddit, and Twitter. The framework significantly outperforms existing methods in predictive accuracy on future edge prediction and dynamic node classification, as well as in computational speed.

Notably, the paper reports that TGNs are up to 30 times faster per epoch than previous models such as TGAT, a substantial efficiency gain.

Implications and Future Directions

The implications of TGNs are substantial for fields that inherently involve dynamic systems, such as social network analysis, recommendation systems, and biological interaction networks. The ability to process graphs with evolving structures and attributes implies more responsive and contextually aware models that can adapt to real-time changes.

Future research could explore a global, graph-wide memory to complement the per-node memories and better capture graph-level temporal dynamics. Evaluating on a wider variety of dynamic-graph datasets and event types could also lead to more robust models.

Conclusion

The Temporal Graph Networks framework provides a comprehensive and efficient solution for deep learning on dynamic graphs. Its adaptability to continuous-time scenarios and computational advantages make it a valuable tool for researchers and practitioners dealing with complex, temporally evolving networks. The paper sets a new standard in the domain of dynamic graph processing, paving the way for future innovations in graph-based machine learning.
