
Lying Graph Convolution: Learning to Lie for Node Classification Tasks

Published 2 May 2024 in cs.LG and cs.SI | arXiv:2405.01247v1

Abstract: In the context of machine learning for graphs, many researchers have empirically observed that Deep Graph Networks (DGNs) perform favourably on node classification tasks when the graph structure is homophilic (i.e., adjacent nodes are similar). In this paper, we introduce Lying-GCN, a new DGN inspired by opinion dynamics that can adaptively work in both the heterophilic and the homophilic setting. At each layer, each agent (node) shares its opinion (node embedding) with its neighbours. Instead of having agents share their opinions directly, as in GCN, we introduce a mechanism that allows them to lie. This mechanism is adaptive: the agents learn how and when to lie according to the task to be solved. We characterise our proposal in terms of dynamical systems by studying the spectral properties of the system's coefficient matrix. Although the steady state of the system collapses to zero, we argue that the lying mechanism remains usable for node classification. We validate this claim empirically on both synthetic and real-world datasets, showing that the lying mechanism improves performance in the heterophilic setting without harming results in the homophilic one.
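The core idea sketched in the abstract, as I read it: in a standard GCN layer each node broadcasts its embedding to its neighbours unchanged, whereas a Lying-GCN-style layer lets each node apply a learned distortion to the embedding it broadcasts. The paper's exact parameterisation of the lying mechanism is not given in this excerpt, so the `lying_gcn_layer` below uses a simple learned gate purely as an illustrative placeholder, not as the authors' actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, symmetric adjacency with self-loops, row-normalised.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
A_hat = A / A.sum(axis=1, keepdims=True)

X = rng.standard_normal((4, 3))   # node embeddings (4 nodes, 3 features)
W = rng.standard_normal((3, 3))   # shared layer weight matrix

def gcn_layer(X, A_hat, W):
    """Standard GCN propagation: nodes share their embeddings as-is."""
    return np.tanh(A_hat @ X @ W)

def lying_gcn_layer(X, A_hat, W, W_lie):
    """Illustrative 'lying' variant: each node distorts the embedding it
    broadcasts via a learned, input-dependent gate (a placeholder for the
    paper's actual mechanism)."""
    gate = np.tanh(X @ W_lie)     # learned distortion of the shared opinion
    X_shared = gate * X           # what the neighbours actually receive
    return np.tanh(A_hat @ X_shared @ W)

W_lie = rng.standard_normal((3, 3))
H_plain = gcn_layer(X, A_hat, W)
H_lying = lying_gcn_layer(X, A_hat, W, W_lie)
print(H_plain.shape, H_lying.shape)  # both (4, 3)
```

Since the gate is learned end-to-end, the network can in principle make it close to the identity on homophilic graphs (share honestly) and strongly distorting on heterophilic ones, which matches the abstract's claim of adapting to both settings.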

