
From random-walks to graph-sprints: a low-latency node embedding framework on continuous-time dynamic graphs (2307.08433v5)

Published 17 Jul 2023 in cs.LG

Abstract: Many real-world datasets have an underlying dynamic graph structure, where entities and their interactions evolve over time. Machine learning models should consider these dynamics in order to harness their full potential in downstream tasks. Previous approaches for graph representation learning have focused on either sampling k-hop neighborhoods, akin to breadth-first search, or random walks, akin to depth-first search. However, these methods are computationally expensive and unsuitable for real-time, low-latency inference on dynamic graphs. To overcome these limitations, we propose graph-sprints, a general-purpose feature extraction framework for continuous-time dynamic graphs (CTDGs) that has low latency and is competitive with state-of-the-art, higher-latency models. To achieve this, we propose a streaming, low-latency approximation of random-walk-based features. In our framework, time-aware node embeddings summarizing multi-hop information are computed using only single-hop operations on the incoming edges. We evaluate our proposed approach on three open-source datasets and two in-house datasets, and compare it with three state-of-the-art algorithms (TGN-attn, TGN-ID, Jodie). We demonstrate that our graph-sprints features, combined with a machine learning classifier, achieve competitive performance, outperforming all baselines on the node classification task in five datasets. Simultaneously, graph-sprints significantly reduces inference latency, achieving close to an order-of-magnitude speed-up in our experimental setting.
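The key mechanism described in the abstract is that each incoming edge triggers only a constant-time, single-hop update: the destination node's running summary absorbs a decayed fraction of the source node's summary, so multi-hop random-walk information accumulates recursively without ever sampling walks. Below is a minimal sketch of such a streaming update, assuming an exponential time decay and mixing weight; the class name `StreamingNodeSummary` and the parameters `alpha` and `tau` are illustrative assumptions, not the paper's exact update rule.

```python
import math
from collections import defaultdict

import numpy as np

class StreamingNodeSummary:
    """Hypothetical sketch of a single-hop streaming update in the spirit of
    graph-sprints: each edge event pulls the source node's running summary
    into the destination's summary, so multi-hop information accumulates
    recursively with O(1) work per edge."""

    def __init__(self, dim: int, alpha: float = 0.5, tau: float = 3600.0):
        self.alpha = alpha  # weight on the neighbor's summary (assumed mixing parameter)
        self.tau = tau      # time-decay constant in seconds (assumed)
        self.state = defaultdict(lambda: np.zeros(dim))  # node -> summary vector
        self.last_seen = defaultdict(float)              # node -> time of last update

    def update(self, src: int, dst: int, t: float, edge_feat: np.ndarray) -> np.ndarray:
        """Constant-time update for a single incoming edge (src -> dst) at time t."""
        # Decay dst's existing summary according to the time elapsed since its last event.
        decay = math.exp(-(t - self.last_seen[dst]) / self.tau)
        self.state[dst] = (
            decay * (1.0 - self.alpha) * self.state[dst]  # dst's own decayed history
            + self.alpha * self.state[src]                # single-hop pull of src's multi-hop summary
            + edge_feat                                   # features of the current interaction
        )
        self.last_seen[dst] = t
        return self.state[dst]  # time-aware embedding for downstream use
```

Feeding the returned vectors into a lightweight downstream classifier mirrors the setup described in the abstract, where graph-sprints features are combined with a standard machine learning classifier for node classification.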

References (29)
  1. Optuna: A next-generation hyperparameter optimization framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2623–2631.
  2. Deep coevolutionary network: Embedding user and item features for recommendation. arXiv preprint arXiv:1609.03675 (2016).
  3. Graph neural networks for social recommendation. In The World Wide Web Conference. 417–426.
  4. Matthias Fey and Jan E. Lenssen. 2019. Fast Graph Representation Learning with PyTorch Geometric. In ICLR Workshop on Representation Learning on Graphs and Manifolds.
  5. DynGEM: Deep embedding method for dynamic graphs. arXiv preprint arXiv:1805.11273 (2018).
  6. Aditya Grover and Jure Leskovec. 2016. node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 855–864.
  7. Continuous Temporal Graph Networks for Event-Based Graph Data. arXiv preprint arXiv:2205.15924 (2022).
  8. Node2bits: Compact time- and attribute-aware node representations for user stitching. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, 483–506.
  9. From static to dynamic node embeddings. arXiv preprint arXiv:2009.10017 (2020).
  10. On Generalizing Static Node Embedding to Dynamic Settings. In Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining. 410–420.
  11. Neural Temporal Walks: Motif-Aware Representation Learning on Continuous-Time Dynamic Graphs. In Advances in Neural Information Processing Systems.
  12. Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM.
  13. Dynamic node embeddings from edge streams. IEEE Transactions on Emerging Topics in Computational Intelligence 5, 6 (2020), 931–946.
  14. Continuous-time dynamic network embeddings. In Companion Proceedings of The Web Conference 2018. 969–976.
  15. DeepWalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. 701–710.
  16. Temporal Graph Networks for Deep Learning on Dynamic Graphs. In ICML 2020 Workshop on Graph Representation Learning.
  17. Efficient representation learning using random walks for dynamic graphs. arXiv preprint arXiv:1901.01346 (2019).
  18. DySAT: Deep neural representation learning on dynamic graphs via self-attention networks. In Proceedings of the 13th International Conference on Web Search and Data Mining. 519–527.
  19. Provably expressive temporal graph networks. Advances in Neural Information Processing Systems 35 (2022), 32257–32269.
  20. A review on graph neural network methods in financial applications. arXiv preprint arXiv:2111.15367 (2021).
  21. Hyperbolic node embedding for temporal networks. Data Mining and Knowledge Discovery 35, 5 (2021), 1906–1940.
  22. APAN: Asynchronous propagation attention network for real-time temporal graph embedding. In Proceedings of the 2021 International Conference on Management of Data. 2628–2638.
  23. Inductive representation learning in temporal networks via causal anonymous walks. arXiv preprint arXiv:2101.05974 (2021).
  24. Hashing-accelerated graph neural networks for link prediction. In Proceedings of the Web Conference 2021. 2910–2920.
  25. A comprehensive survey on graph neural networks. IEEE Transactions on Neural Networks and Learning Systems 32, 1 (2020), 4–24.
  26. Inductive representation learning on temporal graphs. arXiv preprint arXiv:2002.07962 (2020).
  27. Streaming graph embeddings via incremental neighborhood sketching. IEEE Transactions on Knowledge and Data Engineering 35, 5 (2022), 5296–5310.
  28. ROLAND: Graph learning framework for dynamic graphs. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. 2358–2366.
  29. Graph neural networks and their current applications in bioinformatics. Frontiers in Genetics 12 (2021), 690049.