
ASWT-SGNN: Adaptive Spectral Wavelet Transform-based Self-Supervised Graph Neural Network (2312.05736v1)

Published 10 Dec 2023 in cs.LG and eess.SP

Abstract: Graph Contrastive Learning (GCL) is a self-supervised method that combines the advantages of Graph Convolutional Networks (GCNs) and contrastive learning, making it promising for learning node representations. However, the GCN encoders used in these methods rely on the Fourier transform to learn fixed graph representations, which is inherently limited by the uncertainty principle governing the trade-off between spatial and spectral localization. To overcome the inflexibility of existing methods and the computational expense of eigen-decomposition and dense matrix multiplication, this paper proposes an Adaptive Spectral Wavelet Transform-based Self-Supervised Graph Neural Network (ASWT-SGNN). The proposed method employs spectrum-adapted polynomials to approximate the filter function and optimizes the wavelet using a contrastive loss. This design enables the construction of filters that are localized in both the spectral and spatial domains, allowing flexible aggregation of neighborhood information at various scales and facilitating controlled transformation between local and global information. Compared to existing methods, the proposed approach reduces computational complexity and addresses the limitation of graph convolutional neural networks, which are constrained by graph size and lack flexible control over neighborhood scale. Extensive experiments on eight benchmark datasets demonstrate that ASWT-SGNN accurately approximates the filter function in high-density spectral regions while avoiding costly eigen-decomposition, and that it achieves performance comparable to state-of-the-art models on node classification tasks.
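To make the filtering idea concrete, the sketch below shows one standard way to apply a spectral wavelet filter without eigen-decomposition: expand the filter g(s·λ) in Chebyshev polynomials of the normalized graph Laplacian, so that filtering a feature matrix reduces to K sparse matrix-vector products. This is a minimal illustration of the general technique the abstract names, not the authors' implementation; the heat-kernel filter g(λ) = exp(-s·λ), the scale s, and the polynomial order K are illustrative assumptions.

```python
# Minimal sketch: polynomial approximation of a spectral wavelet filter,
# applied via sparse products only (no eigen-decomposition).
# Illustrative assumptions: heat-kernel filter, scale s, Chebyshev order K.
import numpy as np
import scipy.sparse as sp
from numpy.polynomial import chebyshev as cheb

def normalized_laplacian(adj: sp.spmatrix) -> sp.spmatrix:
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = np.asarray(adj.sum(axis=1)).ravel().astype(float)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    D = sp.diags(d_inv_sqrt)
    return sp.eye(adj.shape[0]) - D @ adj @ D

def wavelet_filter(L, X, scale=1.0, K=10, lmax=2.0):
    """Apply g(s*lam) = exp(-s*lam) to features X via a degree-K Chebyshev
    expansion of g on [0, lmax]; lmax = 2 upper-bounds the spectrum of the
    normalized Laplacian, so no eigenvalues are ever computed."""
    # Fit Chebyshev coefficients of g on the rescaled interval [-1, 1].
    lam = np.linspace(0.0, lmax, 256)
    coeffs = cheb.chebfit(2.0 * lam / lmax - 1.0, np.exp(-scale * lam), K)
    # Map L's spectrum into [-1, 1] so the recurrence is well defined.
    L_hat = (2.0 / lmax) * L - sp.eye(L.shape[0])
    # Three-term recurrence: T_0 X = X, T_1 X = L_hat X,
    # T_k X = 2 L_hat (T_{k-1} X) - T_{k-2} X.
    T_prev, T_curr = X, L_hat @ X
    out = coeffs[0] * T_prev + coeffs[1] * T_curr
    for k in range(2, K + 1):
        T_prev, T_curr = T_curr, 2.0 * (L_hat @ T_curr) - T_prev
        out = out + coeffs[k] * T_curr
    return out

# Toy usage: a 4-node path graph with random 8-dimensional features.
A = sp.csr_matrix(np.array([[0, 1, 0, 0],
                            [1, 0, 1, 0],
                            [0, 1, 0, 1],
                            [0, 0, 1, 0]], dtype=float))
X = np.random.randn(4, 8)
H = wavelet_filter(normalized_laplacian(A), X, scale=1.5, K=8)
```

A larger K tightens the polynomial fit but widens the filter's spatial support, which is the spectral-spatial localization trade-off the abstract invokes; ASWT-SGNN is described as going further by adapting the polynomial to the graph's spectral density and tuning the wavelet scales with a contrastive loss.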
