Graph Parsing Networks (2402.14393v1)

Published 22 Feb 2024 in cs.LG

Abstract: Graph pooling compresses graph information into a compact representation. State-of-the-art graph pooling methods follow a hierarchical approach that reduces the graph size step by step. These methods trade memory efficiency against node-information preservation, depending on whether they rely on node dropping or node clustering. Moreover, a fixed pooling ratio or a fixed number of pooling layers is predefined for all graphs, which prevents a personalized pooling structure from being captured for each individual graph. In this work, inspired by bottom-up grammar induction, we propose an efficient graph parsing algorithm to infer the pooling structure, which then drives graph pooling. The resulting Graph Parsing Network (GPN) adaptively learns a personalized pooling structure for each individual graph. GPN benefits from the discrete assignments generated by the graph parsing algorithm, achieving good memory efficiency while keeping node information intact. Experimental results on standard benchmarks demonstrate that GPN outperforms state-of-the-art graph pooling methods on graph classification tasks while achieving competitive performance on node classification tasks. We also conduct a graph reconstruction task to demonstrate GPN's ability to preserve node information, and measure its memory and time efficiency with dedicated tests.
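The abstract describes the parsing-then-pooling idea only at a high level. As a rough illustration, and not the authors' exact algorithm, the sketch below shows one plausible reading of a single parsing step: score each edge from its endpoint features, let every node keep its highest-scoring incident edge, contract the resulting connected components into clusters, and sum-pool node features into the coarsened nodes. The `scorer`, the dominant-edge rule, and all function names here are assumptions made for illustration.

```python
import torch

def parse_and_pool(x, edge_index, scorer):
    """One hypothetical graph-parsing pooling step (illustrative sketch).

    x:          [N, d] node features
    edge_index: [2, E] undirected edges, each pair listed once
    scorer:     callable mapping endpoint features to edge scores
    Returns a discrete cluster assignment [N] and pooled features [C, d].
    """
    n = x.size(0)
    src, dst = edge_index
    # Score each edge from its endpoint features.
    scores = scorer(x[src], x[dst]).squeeze(-1)  # [E]

    # Each node keeps only its highest-scoring incident edge
    # (a "dominant edge" rule, one reading of bottom-up parsing).
    best = torch.full((n,), -1, dtype=torch.long)
    best_score = torch.full((n,), float("-inf"))
    for e in range(edge_index.size(1)):
        for node in (src[e].item(), dst[e].item()):
            if scores[e] > best_score[node]:
                best_score[node] = scores[e]
                best[node] = e

    # Contract: union the endpoints of every selected edge (union-find).
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for e in set(best.tolist()):
        if e >= 0:
            a, b = find(src[e].item()), find(dst[e].item())
            parent[a] = b

    roots = [find(i) for i in range(n)]
    uniq = {r: c for c, r in enumerate(dict.fromkeys(roots))}
    assign = torch.tensor([uniq[r] for r in roots])  # [N], discrete clusters

    # Sum-pool node features into cluster features; nothing is dropped.
    pooled = torch.zeros(len(uniq), x.size(1)).index_add_(0, assign, x)
    return assign, pooled

# Toy usage: a 6-node path graph with random features and dot-product scores.
x = torch.randn(6, 8)
edge_index = torch.tensor([[0, 1, 2, 3, 4], [1, 2, 3, 4, 5]])
scorer = lambda a, b: (a * b).sum(-1, keepdim=True)
assign, pooled = parse_and_pool(x, edge_index, scorer)
print(assign, pooled.shape)
```

Note that the assignment here is discrete (each node belongs to exactly one cluster), which is what plausibly gives the stated memory advantage over dense soft-clustering matrices; repeating such a step until the graph stops shrinking would yield a pooling depth that differs per graph, matching the abstract's claim of a personalized pooling structure.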

Authors (5)
  1. Yunchong Song
  2. Siyuan Huang
  3. Xinbing Wang
  4. Chenghu Zhou
  5. Zhouhan Lin