Individual and Structural Graph Information Bottlenecks for Out-of-Distribution Generalization (2306.15902v1)

Published 28 Jun 2023 in cs.LG, cs.AI, and cs.CV

Abstract: Out-of-distribution (OOD) graph generalization is critical for many real-world applications. Existing methods fail to discard spurious or noisy input features that are irrelevant to the label. Moreover, they mainly perform instance-level class-invariant graph learning and do not exploit the structural class relationships between graph instances. In this work, we address these issues in a unified framework, dubbed Individual and Structural Graph Information Bottlenecks (IS-GIB). To remove class-spurious features caused by distribution shifts, we propose the Individual Graph Information Bottleneck (I-GIB), which discards irrelevant information by minimizing the mutual information between the input graph and its embedding. To leverage structural intra- and inter-domain correlations, we propose the Structural Graph Information Bottleneck (S-GIB). Specifically, for a batch of graphs drawn from multiple domains, S-GIB first computes the pairwise input-input, embedding-embedding, and label-label correlations. It then minimizes the mutual information between input-graph pairs and embedding pairs while maximizing the mutual information between embedding pairs and label pairs. The critical insight of S-GIB is to simultaneously discard spurious features and learn invariant features from a high-order perspective by maintaining class relationships under multiple distributional shifts. Finally, we unify I-GIB and S-GIB into the complementary framework IS-GIB. Extensive experiments on both node- and graph-level tasks consistently demonstrate the superior generalization ability of IS-GIB. The code is available at https://github.com/YangLing0818/GraphOOD.
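For concreteness, below is a minimal sketch of what an IS-GIB-style training objective could look like. It assumes a plain MLP over pre-pooled graph features in place of the paper's GNN encoder, the standard variational (KL-to-prior) upper bound on I(G; Z) for the individual term, and a crude similarity-matrix alignment as a proxy for the pairwise structural terms. All names (IGIBEncoder, i_gib_loss, s_gib_loss, the loss weights) are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
# Hypothetical sketch of an IS-GIB-style objective (not the authors' code).
# Encoder: a stochastic MLP over pre-pooled graph features (the paper uses a GNN).
# I-GIB term: deep-VIB-style KL upper bound on I(G; Z).
# S-GIB term: similarity-matrix proxy for the pairwise MI terms.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IGIBEncoder(nn.Module):
    """Stochastic encoder q(z | g) that outputs a Gaussian over embeddings."""
    def __init__(self, in_dim, z_dim):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, z_dim)
        self.logvar = nn.Linear(128, z_dim)

    def forward(self, g):
        h = self.backbone(g)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        return z, mu, logvar

def i_gib_loss(mu, logvar):
    # Variational upper bound on I(G; Z): KL(q(z|g) || N(0, I)),
    # as in the deep variational information bottleneck.
    return -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(1).mean()

def pairwise_cosine(x):
    # B x B cosine-similarity matrix over a batch.
    x = F.normalize(x, dim=1)
    return x @ x.t()

def s_gib_loss(g, z, y_onehot):
    # Crude proxy for the structural terms: align embedding-embedding
    # similarities with label-label similarities (maximize MI between
    # embedding pairs and label pairs) and penalize co-alignment with
    # input-input similarities (minimize MI between input and embedding pairs).
    s_in, s_emb, s_lab = pairwise_cosine(g), pairwise_cosine(z), pairwise_cosine(y_onehot)
    keep = F.mse_loss(s_emb, s_lab)   # embedding pairs should track label pairs
    discard = (s_emb * s_in).mean()   # discourage tracking raw-input pairs
    return keep + discard

# Toy batch: 32 graphs summarized as 64-d feature vectors, 4 classes.
g = torch.randn(32, 64)
y = torch.randint(0, 4, (32,))
encoder, classifier = IGIBEncoder(64, 32), nn.Linear(32, 4)

z, mu, logvar = encoder(g)
logits = classifier(z)
loss = (F.cross_entropy(logits, y)
        + 1e-3 * i_gib_loss(mu, logvar)                      # I-GIB term
        + 1e-1 * s_gib_loss(g, z, F.one_hot(y, 4).float()))  # S-GIB term
loss.backward()
```

Note that the structural term operates on whole batches: it compares three B x B similarity matrices rather than estimating per-instance mutual information, which matches the abstract's "high-order" pairwise view; the paper's exact MI estimators and weighting may differ.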

Authors (8)
  1. Ling Yang
  2. Jiayi Zheng
  3. Heyuan Wang
  4. Zhongyi Liu
  5. Zhilin Huang
  6. Shenda Hong
  7. Wentao Zhang
  8. Bin Cui
Citations (9)