Bayesian graph convolutional neural networks for semi-supervised classification (1811.11103v1)

Published 27 Nov 2018 in stat.ML and cs.LG

Abstract: Recently, techniques for applying convolutional neural networks to graph-structured data have emerged. Graph convolutional neural networks (GCNNs) have been used to address node and graph classification and matrix completion. Although the performance has been impressive, the current implementations have limited capability to incorporate uncertainty in the graph structure. Almost all GCNNs process a graph as though it is a ground-truth depiction of the relationship between nodes, but often the graphs employed in applications are themselves derived from noisy data or modelling assumptions. Spurious edges may be included; other edges may be missing between nodes that have very strong relationships. In this paper we adopt a Bayesian approach, viewing the observed graph as a realization from a parametric family of random graphs. We then target inference of the joint posterior of the random graph parameters and the node (or graph) labels. We present the Bayesian GCNN framework and develop an iterative learning procedure for the case of assortative mixed-membership stochastic block models. We present the results of experiments that demonstrate that the Bayesian formulation can provide better performance when there are very few labels available during the training process.

Authors (4)
  1. Yingxue Zhang (72 papers)
  2. Soumyasundar Pal (16 papers)
  3. Mark Coates (75 papers)
  4. Deniz Üstebay (2 papers)
Citations (212)

Summary

  • The paper presents a Bayesian GCNN that integrates uncertainty in graph structure through joint inference of network weights and graph parameters.
  • It employs a stochastic block model to effectively handle noisy and incomplete data, improving classification accuracy with limited labels.
  • Experimental results on Cora, CiteSeer, and Pubmed datasets show superior robustness against adversarial graph perturbations.

Bayesian Graph Convolutional Neural Networks for Semi-Supervised Classification

The paper introduces a framework that combines Bayesian inference with graph convolutional neural networks (GCNNs) to improve how these networks handle uncertainty in graph-structured data, specifically for semi-supervised node classification. The authors develop a Bayesian GCNN formulation that uses an assortative mixed-membership stochastic block model (a-MMSBM) to model the underlying graph structure.

Theoretical Framework and Methodology

Traditional GCNNs, while effective in applications such as node classification and link prediction, generally treat the graph structure as definitive and error-free. This assumption can lead to inaccuracies when the graph is derived from noisy or incomplete data, missing significant relationships or incorporating spurious ones. To address this, the authors adopt a Bayesian perspective, viewing the observed graph as one realization from a parametric family of random graphs. This makes the GCNN more robust to uncertainty in the graph data by enabling joint inference over the graph parameters, the network weights, and the node labels.
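The inference target can be sketched as the following marginalization (the notation here is generic and may differ from the paper's: $Z$ denotes the node labels, $Y_L$ the observed labels, $X$ the node features, $W$ the GCNN weights, $G$ a sampled graph, $\lambda$ the random-graph parameters, and $G_{\text{obs}}$ the observed graph):

$$
p(Z \mid Y_L, X, G_{\text{obs}}) = \int p(Z \mid W, G, X)\, p(W \mid Y_L, X, G)\, p(G \mid \lambda)\, p(\lambda \mid G_{\text{obs}})\, dW \, dG \, d\lambda
$$

This integral is intractable in general, so it is approximated by Monte Carlo sampling: drawing graph parameters from their posterior given the observed graph, drawing graphs conditioned on those parameters, and drawing weights for each sampled graph.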

The Bayesian framework involves sampling from a posterior distribution that includes not only the standard neural network weights but also parameters representing the graph structure. The resulting model predicts node labels by marginalizing over these posterior samples, effectively capturing the uncertainty inherent in both the graph and the model predictions.
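The Monte Carlo marginalization described above can be illustrated with a minimal NumPy toy. This is a sketch, not the paper's implementation: the graph sampler here simply perturbs edges at random, standing in for draws from the a-MMSBM posterior, and the weight "samples" are random draws standing in for posterior weight samples (obtained via MC dropout in the paper). All function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_forward(A, X, W1, W2):
    """Two-layer GCN forward pass returning per-node class probabilities."""
    # Symmetric normalization: A_hat = D^{-1/2} (A + I) D^{-1/2}
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    A_hat = A_tilde / np.sqrt(np.outer(d, d))
    H = np.maximum(A_hat @ X @ W1, 0.0)            # ReLU hidden layer
    logits = A_hat @ H @ W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)        # row-wise softmax

def sample_graph(A_obs, flip_prob=0.05):
    """Stand-in for sampling from the random-graph posterior:
    randomly toggle a small fraction of (symmetric) edges."""
    n = A_obs.shape[0]
    mask = rng.random((n, n)) < flip_prob
    mask = np.triu(mask, 1)
    mask = mask + mask.T
    return np.where(mask, 1 - A_obs, A_obs)

def bayesian_predict(A_obs, X, n_graphs=8, n_weights=4, hidden=16, classes=3):
    """Monte Carlo approximation of the predictive distribution:
    average class probabilities over sampled graphs and weight draws."""
    probs = np.zeros((X.shape[0], classes))
    for _ in range(n_graphs):
        G = sample_graph(A_obs)
        for _ in range(n_weights):
            W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
            W2 = rng.normal(0.0, 0.5, (hidden, classes))
            probs += gcn_forward(G, X, W1, W2)
    return probs / (n_graphs * n_weights)
```

Because each inner term is a valid probability distribution over classes, the averaged prediction is as well, and its spread across samples reflects the combined uncertainty in the graph and the weights.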

Experimental Evaluation

The empirical evaluation demonstrates the efficacy of Bayesian GCNNs over conventional methods, especially when labeled data is limited. The authors conduct experiments on benchmark citation datasets—Cora, CiteSeer, and Pubmed—using both fixed and random splits. The Bayesian GCNN shows notable improvements in classification accuracy, most prominently when few labeled samples are provided. This advantage is attributed to the framework's ability to infer more informative representations from sparse labels by explicitly modeling uncertainty.

Furthermore, the robustness of the Bayesian GCNN is assessed under random adversarial attacks designed to perturb the graph structure. Compared to non-Bayesian GCNNs, the Bayesian approach demonstrates significantly greater resilience to such disruptions, maintaining higher classification accuracy and more stable prediction margins despite the structural perturbations.

Implications and Future Directions

The incorporation of Bayesian inference into GCNNs opens new pathways for developing more resilient graph-based learning models that are better equipped to handle data uncertainty and noise. This approach not only provides a mechanism for enhancing the robustness of GCNNs but also offers theoretical insight into how uncertainty can be systematically integrated into graph learning algorithms.

Future research could explore various extensions and adaptations of this framework. It would be particularly interesting to expand the Bayesian GCNN approach to other graph models beyond a-MMSBM, allowing for application-specific adaptations, and to investigate its applicability to a broader set of tasks such as dynamic graph analysis or scalable learning scenarios in large-scale graphs.

In conclusion, the paper makes a significant contribution by addressing a critical limitation of existing graph neural networks and offering a well-grounded methodology for incorporating uncertainty into graph learning. This Bayesian approach paves the way for further exploration of robust and interpretable graph-based learning systems.