
Loss-aware Curriculum Learning for Heterogeneous Graph Neural Networks (2402.18875v1)

Published 29 Feb 2024 in cs.LG

Abstract: Heterogeneous Graph Neural Networks (HGNNs) are a class of deep learning models designed for heterogeneous graphs, i.e., graphs that contain multiple types of nodes and edges. This paper investigates the application of curriculum learning techniques to improve the performance and robustness of HGNNs. To better assess the quality of the training data, we design a loss-aware training schedule, named LTS, that measures the quality of every node and feeds the training data to the model progressively, increasing the difficulty step by step. LTS can be seamlessly integrated into various frameworks, effectively reducing bias and variance, mitigating the impact of noisy data, and enhancing overall accuracy. Our findings demonstrate the efficacy of curriculum learning in enhancing the capability of HGNNs to analyze complex graph-structured data. The code is public at https://github.com/LARS-research/CLGNN/.
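The core idea of a loss-aware curriculum, as described in the abstract, can be sketched as follows. This is a minimal illustration, not the paper's actual LTS implementation: it assumes per-node loss values are available as a proxy for node "difficulty", treats low-loss nodes as easy, and grows the training pool linearly across epochs. The function name `curriculum_schedule` and the `start_frac` parameter are hypothetical.

```python
import numpy as np

def curriculum_schedule(node_losses, epoch, total_epochs, start_frac=0.3):
    """Return indices of nodes to train on at the given epoch.

    Sketch of a loss-aware curriculum: nodes with lower loss are treated
    as "easier" and introduced first; the fraction of nodes included
    grows linearly from `start_frac` to 1.0 over the training run.
    """
    # Linearly grow the included fraction of nodes over the epochs.
    frac = min(1.0, start_frac + (1.0 - start_frac) * epoch / max(1, total_epochs - 1))
    k = max(1, int(frac * len(node_losses)))
    # Sort nodes from lowest loss (easiest) to highest (hardest).
    order = np.argsort(node_losses)
    return order[:k]
```

In a training loop, `node_losses` would be recomputed from the model at each epoch, so the ranking of "easy" versus "hard" nodes adapts as the model improves.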

Authors (4)
  1. Zhen Hao Wong (6 papers)
  2. Hansi Yang (12 papers)
  3. Xiaoyi Fu (2 papers)
  4. Quanming Yao (102 papers)
Citations (1)