RoCP-GNN: Robust Conformal Prediction for Graph Neural Networks in Node-Classification (2408.13825v2)

Published 25 Aug 2024 in cs.LG, cs.AI, and stat.ML

Abstract: Graph Neural Networks (GNNs) have emerged as powerful tools for predicting outcomes in graph-structured data. However, a notable limitation of GNNs is their inability to provide robust uncertainty estimates, which undermines their reliability in contexts where errors are costly. One way to address this issue is by providing prediction sets that contain the true label with a predefined probability margin. Our approach builds upon conformal prediction (CP), a framework that promises to construct statistically robust prediction sets or intervals. There are two primary challenges: first, given dependent data like graphs, it is unclear whether the critical assumption in CP - exchangeability - still holds when applied to node classification. Second, even if the exchangeability assumption is valid for conformalized link prediction, we need to ensure high efficiency, i.e., the resulting prediction set or the interval length is small enough to provide useful information. In this article, we propose a novel approach termed Robust Conformal Prediction for GNNs (RoCP-GNN), which integrates conformal prediction (CP) directly into the GNN training process. This method generates prediction sets, instead of just point predictions, that are valid at a user-defined confidence level, assuming only exchangeability. Our approach robustly predicts outcomes with any predictive GNN model while quantifying the uncertainty in predictions within the realm of graph-based semi-supervised learning (SSL). Experimental results demonstrate that GNN models with size loss provide a statistically significant increase in performance. We validate our approach on standard graph benchmark datasets by coupling it with various state-of-the-art GNNs in node classification. The code will be made available after publication.
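
As a rough, self-contained sketch of the split conformal prediction step the abstract refers to (prediction sets valid at a user-chosen confidence level, assuming only exchangeability between calibration and test nodes), the snippet below turns any GNN's softmax outputs into prediction sets. It is an illustrative assumption rather than the paper's released code: the function name, the nonconformity score (one minus the true-class probability), and the NumPy-based implementation are choices made here for clarity, and it omits RoCP-GNN's distinguishing feature of back-propagating a prediction-set size loss during training.

```python
import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for node classification (illustrative sketch).

    cal_probs:  (n_cal, n_classes) softmax scores for held-out calibration nodes
    cal_labels: (n_cal,) integer true labels for calibration nodes
    test_probs: (n_test, n_classes) softmax scores for test nodes
    alpha:      miscoverage level; sets cover the true label with prob >= 1 - alpha
    """
    n = len(cal_labels)
    # Nonconformity score: 1 minus the probability assigned to the true class.
    cal_scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    q_hat = np.quantile(cal_scores, min(q_level, 1.0), method="higher")
    # A class enters the prediction set when its score falls below the threshold,
    # i.e. when its softmax probability is at least 1 - q_hat.
    return test_probs >= 1.0 - q_hat  # boolean mask of shape (n_test, n_classes)

# Hypothetical usage with outputs from any trained GNN (rows sum to 1):
# sets = conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1)
# The True entries in sets[i] form the prediction set for test node i.
```

Under exchangeability, sets built this way contain the true label with probability at least 1 - alpha; the size loss described in the abstract targets the second challenge, keeping those sets small enough to remain informative.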
