Graph Learning in 4D: a Quaternion-valued Laplacian to Enhance Spectral GCNs (2312.17361v1)

Published 28 Dec 2023 in cs.LG

Abstract: We introduce QuaterGCN, a spectral Graph Convolutional Network (GCN) with quaternion-valued weights whose core component is the Quaternionic Laplacian, a quaternion-valued Laplacian matrix that generalizes two widely used Laplacian matrices: the classical Laplacian (defined for undirected graphs) and the complex-valued Sign-Magnetic Laplacian (proposed to handle digraphs with weights of arbitrary sign). Beyond its generality, the Quaternionic Laplacian is the only Laplacian that fully preserves the topology of a digraph: it can handle graphs and digraphs containing antiparallel pairs of edges (digons) of different weights without collapsing them into a single (directed or undirected) edge, as other Laplacians do. Experimental results show the superior performance of QuaterGCN compared to other state-of-the-art GCNs, particularly in scenarios where the information carried by digons is crucial to the task at hand.
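
The abstract's central claim concerns digons: when a digraph contains antiparallel edges u→v and v→u with different weights, a symmetrization-based scalar Laplacian keeps only one value per node pair, whereas a quaternion-valued entry has four real components and so has room to retain both weights. The sketch below is illustrative only; the component-wise encoding it uses is a hypothetical stand-in for exposition, not the Quaternionic Laplacian construction defined in the paper.

```python
# Illustrative sketch (not the paper's construction): why a scalar,
# symmetrized Laplacian loses digon information, and how a 4-component
# (quaternion-like) entry has room to keep both antiparallel weights.
import numpy as np

# Toy digraph on 2 nodes with a digon of different weights:
#   0 -> 1 with weight 2.0, and 1 -> 0 with weight 5.0
A = np.array([[0.0, 2.0],
              [5.0, 0.0]])

# The classical (undirected) Laplacian needs a symmetric adjacency,
# e.g. A_s = (A + A^T) / 2, which collapses the digon to a single
# weight 3.5 and discards the asymmetry between the two directions.
A_s = (A + A.T) / 2.0
D_s = np.diag(A_s.sum(axis=1))
L_classical = D_s - A_s
print("Symmetrized adjacency:\n", A_s)   # both off-diagonals become 3.5

# A quaternion q = a + b*i + c*j + d*k has four real components.
# One *hypothetical* encoding (illustration only): store the forward
# weight in one component and the backward weight in another, so the
# (0, 1) entry alone still determines both original weights.
def quat(a=0.0, b=0.0, c=0.0, d=0.0):
    return np.array([a, b, c, d])

Q01 = quat(b=A[0, 1], c=A[1, 0])   # keeps both 2.0 and 5.0
Q10 = quat(b=A[1, 0], c=A[0, 1])   # reverse entry swaps the roles
print("Quaternion-valued entry (0,1):", Q01)   # [0. 2. 5. 0.]
print("Quaternion-valued entry (1,0):", Q10)   # [0. 5. 2. 0.]
```

Under symmetrization both digon weights are replaced by their average and cannot be recovered, while any 4-component entry that stores the two weights separately can be mapped back to the original pair; the paper's Quaternionic Laplacian obtains this preservation through an actual quaternion algebra rather than the naive component stacking shown here.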
