Scalable and Efficient Temporal Graph Representation Learning via Forward Recent Sampling (2402.01964v2)
Abstract: Temporal graph representation learning (TGRL) is essential for modeling dynamic systems in real-world networks. However, traditional TGRL methods, despite their effectiveness, often face significant computational challenges and inference delays due to the inefficient sampling of temporal neighbors. Conventional sampling methods typically involve backtracking through the interaction history of each node. In this paper, we propose a novel TGRL framework, No-Looking-Back (NLB), which overcomes these challenges by introducing a forward recent sampling strategy. This strategy eliminates the need to backtrack through historical interactions by utilizing a GPU-executable, size-constrained hash table for each node. The hash table records a down-sampled set of recent interactions, enabling rapid query responses with minimal inference latency. The maintenance of this hash table is highly efficient, operating with $O(1)$ complexity. Fully compatible with GPU processing, NLB maximizes programmability, parallelism, and power efficiency. Empirical evaluations demonstrate that NLB not only matches or surpasses state-of-the-art methods in accuracy for tasks like link prediction and node classification across six real-world datasets but also achieves 1.32-4.40x faster training, 1.2-7.94x greater energy efficiency, and 1.63-12.95x lower inference latency compared to competitive baselines. Code is available at https://github.com/Graph-COM/NLB.
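The forward recent sampling idea described above can be illustrated with a minimal sketch. The class and method names below are hypothetical, and the probabilistic-replacement rule is an assumed down-sampling policy, not NLB's exact implementation: each node keeps a fixed-size table, each arriving interaction is hashed into a slot in O(1), and querying a node's temporal neighbors is a direct table read with no backtracking through history.

```python
import random


class ForwardRecentSampler:
    """Hypothetical sketch of forward recent sampling.

    Each node owns a size-constrained table holding a down-sampled set
    of its recent interactions. Updates on edge arrival are O(1), and
    sampling temporal neighbors never scans the interaction history.
    """

    def __init__(self, num_nodes: int, table_size: int,
                 replace_prob: float = 0.8, seed: int = 0):
        self.table_size = table_size
        self.replace_prob = replace_prob  # assumed down-sampling knob
        self.rng = random.Random(seed)
        # tables[u][slot] holds (neighbor, timestamp) or None
        self.tables = [[None] * table_size for _ in range(num_nodes)]

    def insert(self, u: int, v: int, t: float) -> None:
        """O(1) maintenance: hash the new interaction into u's table.

        An occupied slot is overwritten with probability replace_prob,
        biasing the table toward recent interactions (a down-sample of
        the full history).
        """
        slot = hash((v, t)) % self.table_size
        if self.tables[u][slot] is None or self.rng.random() < self.replace_prob:
            self.tables[u][slot] = (v, t)

    def sample(self, u: int):
        """Return u's down-sampled recent neighbors: a plain table read."""
        return [entry for entry in self.tables[u] if entry is not None]
```

Because every operation is a fixed-size array access, this layout maps naturally onto GPU-resident tensors, which is the property the abstract highlights; the Python version above is only for exposition.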