
Enhancing Signed Graph Neural Networks through Curriculum-Based Training (2310.11083v2)

Published 17 Oct 2023 in cs.LG

Abstract: Signed graphs are powerful models for representing complex relations with both positive and negative connections. Recently, Signed Graph Neural Networks (SGNNs) have emerged as potent tools for analyzing such graphs. To our knowledge, no prior research has been conducted on devising a training plan specifically for SGNNs. The prevailing training approach feeds samples (edges) to models in a random order, so each sample contributes equally during training, but this fails to account for the varying learning difficulties determined by the graph's structure. We contend that SGNNs can benefit from a curriculum that progresses from easy to difficult, similar to human learning. The main challenge is evaluating the difficulty of edges in a signed graph. We address this by theoretically analyzing the difficulty SGNNs face in learning adequate representations for edges in unbalanced cycles, and we propose a lightweight difficulty measurer. This forms the basis for our innovative Curriculum representation learning framework for Signed Graphs, referred to as CSG. The process involves using the measurer to assign difficulty scores to training samples, adjusting their order with a scheduler, and training the SGNN model accordingly. We empirically evaluate our approach on six real-world signed graph datasets. Our method demonstrates remarkable results, improving the accuracy of popular SGNN models by up to 23.7% and reducing the standard deviation by 8.4%, enhancing model stability.
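The curriculum described above — score each training edge's difficulty, then reveal samples from easy to hard via a scheduler — can be sketched generically as follows. This is an illustrative sketch, not the paper's actual implementation: the `difficulty` scores stand in for the paper's measurer, and the linear pacing function is a common curriculum-learning choice assumed here for concreteness.

```python
import random

def linear_pacing(step, total_steps, init_frac=0.25):
    """Fraction of the training set available at a given step,
    growing linearly from init_frac up to 1.0."""
    return min(1.0, init_frac + (1.0 - init_frac) * step / total_steps)

def curriculum_batches(edges, difficulty, total_steps, batch_size=4):
    """Yield training batches ordered easy-to-hard.

    edges      -- list of training samples (e.g. signed edges)
    difficulty -- parallel list of scores from a difficulty measurer
                  (lower = easier); here these are placeholder inputs
    """
    # Sort sample indices from easiest to hardest once, up front.
    order = sorted(range(len(edges)), key=lambda i: difficulty[i])
    for step in range(total_steps):
        # The scheduler exposes a growing easy-first prefix of the data.
        pool_size = max(batch_size,
                        int(linear_pacing(step, total_steps) * len(edges)))
        pool = order[:pool_size]
        batch = random.sample(pool, min(batch_size, len(pool)))
        yield [edges[i] for i in batch]
```

Early batches are drawn only from the easiest edges; as training proceeds, harder edges (e.g. those in unbalanced cycles) enter the pool, so the model sees them only after it has fit the easier structure.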
