CTRL: Continuous-Time Representation Learning on Temporal Heterogeneous Information Network (2405.08013v1)

Published 11 May 2024 in cs.LG, cs.AI, and cs.SI

Abstract: Inductive representation learning on temporal heterogeneous graphs is crucial for scalable deep learning on time-varying heterogeneous information networks (HINs), such as citation networks. However, most existing approaches are not inductive and thus cannot handle new nodes or edges. Moreover, previous temporal graph embedding methods are often trained with the temporal link prediction task to simulate the link formation process of temporal graphs, while ignoring the evolution of high-order topological structures on temporal graphs. To fill these gaps, we propose a Continuous-Time Representation Learning (CTRL) model on temporal HINs. To preserve heterogeneous node features and temporal structures, CTRL integrates three components in a single layer: 1) a \emph{heterogeneous attention} unit that measures the semantic correlation between nodes, 2) an \emph{edge-based Hawkes process} that captures temporal influence between heterogeneous nodes, and 3) a \emph{dynamic centrality} measure that indicates the dynamic importance of a node. We train the CTRL model with a future event (subgraph) prediction task to capture the evolution of the high-order network structure. Extensive experiments have been conducted on three benchmark datasets. The results demonstrate that our model significantly boosts performance and outperforms various state-of-the-art approaches. Ablation studies confirm the effectiveness of the model design.
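
The abstract describes the three components only at a high level. As a rough, unofficial illustration of the kind of machinery involved, the sketch below pairs the classic Hawkes intensity (a base rate plus exponentially decaying excitation from past events) with a toy attention-plus-time-decay weighting over a node's historical neighbors. All function names, parameters, and the way the two scores are fused here are hypothetical and are not taken from the paper; CTRL's exact edge-based Hawkes and heterogeneous-attention formulations differ.

```python
import numpy as np

def hawkes_intensity(event_times, t_now, base_rate=0.1, excitation=1.0, decay=0.5):
    """Classic Hawkes intensity: lambda(t) = mu + sum_{t_i < t} alpha * exp(-delta * (t - t_i)).

    This is the standard self-exciting point-process form; the paper's
    edge-based variant conditions on heterogeneous edges and is not shown here.
    """
    event_times = np.asarray(event_times, dtype=float)
    past = event_times[event_times < t_now]
    return base_rate + excitation * np.exp(-decay * (t_now - past)).sum()

def temporal_attention_weights(h_target, h_neighbors, neighbor_times, t_now, decay=0.5):
    """Toy fusion of semantic attention and temporal decay (hypothetical):
    dot-product similarity re-weighted by an exponential time kernel,
    normalized over the neighbor set.
    """
    sim = h_neighbors @ h_target                                   # semantic correlation
    time_kernel = np.exp(-decay * (t_now - np.asarray(neighbor_times, dtype=float)))
    weights = np.exp(sim) * time_kernel                            # excite recent, similar neighbors
    return weights / weights.sum()

# Example: one target node with three historical neighbors observed at t = 1, 3, 4.
rng = np.random.default_rng(0)
h_target = rng.normal(size=8)
h_neighbors = rng.normal(size=(3, 8))
print(hawkes_intensity([1.0, 3.0, 4.0], t_now=5.0))
print(temporal_attention_weights(h_target, h_neighbors, [1.0, 3.0, 4.0], t_now=5.0))
```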

