
Through the Dual-Prism: A Spectral Perspective on Graph Data Augmentation for Graph Classification (2401.09953v2)

Published 18 Jan 2024 in cs.LG

Abstract: Graph Neural Networks (GNNs) have become the preferred tool for processing graph data, with their efficacy boosted by graph data augmentation techniques. Despite the evolution of augmentation methods, issues such as graph property distortion and restricted structural changes persist. This raises the question: is it possible to develop more property-conserving and structure-sensitive augmentation methods? Through a spectral lens, we investigate the interplay between graph properties, their augmentation, and their spectral behavior, and find that keeping the low-frequency eigenvalues unchanged largely preserves critical graph properties when generating augmented graphs. These observations inform our Dual-Prism (DP) augmentation method, comprising DP-Noise and DP-Mask, which retains essential graph properties while diversifying the augmented graphs. Extensive experiments validate the efficiency of our approach, providing a new and promising direction for graph data augmentation.
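The spectral idea in the abstract can be sketched in a few lines: eigendecompose the graph Laplacian, leave the low-frequency (small) eigenvalues untouched, perturb only the high-frequency ones, and rebuild an adjacency matrix. This is a minimal illustration of the general technique, not the paper's exact algorithm; the function name and parameters (`preserve_ratio`, `sigma`) are assumptions for illustration, and the combinatorial Laplacian and Gaussian noise are one possible instantiation of a DP-Noise-style perturbation.

```python
import numpy as np

def dp_noise_augment(A, preserve_ratio=0.3, sigma=0.1, rng=None):
    """Sketch of a spectral augmentation in the DP-Noise spirit:
    keep low-frequency Laplacian eigenvalues fixed, add Gaussian
    noise to the high-frequency ones, and rebuild a (weighted)
    adjacency matrix. A is a symmetric adjacency matrix."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian
    w, U = np.linalg.eigh(L)                # ascending: low frequencies first
    k = int(np.ceil(preserve_ratio * n))    # how many low eigenvalues to keep
    w_aug = w.copy()
    w_aug[k:] += rng.normal(0.0, sigma, size=n - k)  # perturb high frequencies
    L_aug = U @ np.diag(w_aug) @ U.T        # reassemble perturbed Laplacian
    A_aug = np.diag(np.diag(L_aug)) - L_aug # recover weighted adjacency
    A_aug[np.abs(A_aug) < 1e-8] = 0.0       # drop numerical noise
    return A_aug
```

The reconstructed graph is generally weighted and dense; a DP-Mask-style variant would zero out selected high-frequency eigenvalues instead of adding noise. Because the low end of the spectrum is untouched, properties tied to it (e.g. connectivity) are approximately preserved.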

Authors (6)
  1. Yutong Xia
  2. Runpeng Yu
  3. Yuxuan Liang
  4. Xavier Bresson
  5. Xinchao Wang
  6. Roger Zimmermann
Citations (4)

