DyG2Vec: Efficient Representation Learning for Dynamic Graphs (2210.16906v3)

Published 30 Oct 2022 in cs.LG, cs.AI, and cs.SI

Abstract: Temporal graph neural networks have shown promising results in learning inductive representations by automatically extracting temporal patterns. However, previous works often rely on complex memory modules or inefficient random walk methods to construct temporal representations. To address these limitations, we present an efficient yet effective attention-based encoder that leverages temporal edge encodings and window-based subgraph sampling to generate task-agnostic embeddings. Moreover, we propose a joint-embedding architecture using non-contrastive SSL to learn rich temporal embeddings without labels. Experimental results on 7 benchmark datasets indicate that on average, our model outperforms SoTA baselines on the future link prediction task by 4.23% for the transductive setting and 3.30% for the inductive setting, while requiring 5-10x less training/inference time. Lastly, different aspects of the proposed framework are investigated through experimental analysis and ablation studies. The code is publicly available at https://github.com/huawei-noah/noah-research/tree/master/graph_atlas.
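For readers who want a concrete picture of the two ingredients named in the abstract, here is a minimal sketch, assuming PyTorch, of (a) fixed-window temporal subgraph sampling and (b) a VICReg-style non-contrastive joint-embedding loss. The function names, the tensor layout (parallel src/dst/timestamp edge arrays), and the loss weights are illustrative assumptions rather than the authors' implementation; the repository linked above contains the official code.

```python
# Hypothetical sketch of two ideas named in the abstract:
# (1) window-based temporal subgraph sampling and
# (2) a non-contrastive joint-embedding (VICReg-style) objective.
# Names, shapes, and weights are illustrative, not from the DyG2Vec codebase.
import torch
import torch.nn.functional as F


def sample_window_subgraph(src, dst, t, anchor_time, window_size):
    """Keep only edges whose timestamps fall in a fixed-size window
    ending at `anchor_time` (a simplified window-based sampler)."""
    mask = (t <= anchor_time) & (t > anchor_time - window_size)
    return src[mask], dst[mask], t[mask]


def off_diagonal(m):
    """Return the off-diagonal elements of a square matrix."""
    n = m.shape[0]
    return m.flatten()[:-1].view(n - 1, n + 1)[:, 1:].flatten()


def non_contrastive_loss(z_a, z_b, sim_w=25.0, var_w=25.0, cov_w=1.0, eps=1e-4):
    """VICReg-style loss between two views' embeddings: invariance (MSE)
    + variance hinge + covariance decorrelation."""
    n, d = z_a.shape
    inv = F.mse_loss(z_a, z_b)                                     # invariance term
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    var = F.relu(1.0 - std_a).mean() + F.relu(1.0 - std_b).mean()  # keep per-dim variance up
    z_a = z_a - z_a.mean(dim=0)
    z_b = z_b - z_b.mean(dim=0)
    cov_a = (z_a.T @ z_a) / (n - 1)
    cov_b = (z_b.T @ z_b) / (n - 1)
    cov = off_diagonal(cov_a).pow(2).sum() / d + off_diagonal(cov_b).pow(2).sum() / d
    return sim_w * inv + var_w * var + cov_w * cov


if __name__ == "__main__":
    # Toy event stream: 1000 edges over 100 nodes with random timestamps.
    src = torch.randint(0, 100, (1000,))
    dst = torch.randint(0, 100, (1000,))
    t = torch.rand(1000) * 100.0
    s, d_, ts = sample_window_subgraph(src, dst, t, anchor_time=80.0, window_size=20.0)
    # Stand-in embeddings for two views of the sampled subgraph.
    z_a, z_b = torch.randn(64, 128), torch.randn(64, 128)
    print(len(ts), non_contrastive_loss(z_a, z_b).item())
```

In the full method, the two embedding matrices would come from the attention-based encoder (with temporal edge encodings) applied to two views of the sampled window, rather than from random tensors as in this toy usage.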
