
Distribution Free Prediction Sets for Node Classification

Published 26 Nov 2022 in stat.ML and cs.LG (arXiv:2211.14555v3)

Abstract: Graph Neural Networks (GNNs) are able to achieve high classification accuracy on many important real-world datasets, but provide no rigorous notion of predictive uncertainty. Quantifying the confidence of GNN models is difficult due to the dependence between datapoints induced by the graph structure. We leverage recent advances in conformal prediction to construct prediction sets for node classification in inductive learning scenarios. We do this by taking an existing approach for conformal classification that relies on exchangeable data and modifying it by appropriately weighting the conformal scores to reflect the network structure. We show through experiments on standard benchmark datasets using popular GNN models that our approach provides tighter and better calibrated prediction sets than a naive application of conformal prediction.
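The core construction the abstract describes — reweighting conformal scores and taking a weighted quantile to form a prediction set — can be sketched as follows. This is a minimal illustration of the general weighted split-conformal recipe, not the paper's exact method: the choice of nonconformity score (1 minus the softmax probability of the true class) and the graph-proximity weights `cal_weights` are assumptions for the example.

```python
import numpy as np

def weighted_conformal_set(cal_scores, cal_weights, test_probs, alpha=0.1):
    """Form a prediction set from weighted calibration scores.

    cal_scores:  nonconformity scores of calibration nodes,
                 e.g. 1 - softmax probability of the true label.
    cal_weights: nonnegative weights reflecting each calibration node's
                 relevance to the test node (e.g. graph proximity) --
                 a hypothetical weighting scheme for illustration.
    test_probs:  softmax class probabilities for the test node.
    """
    # Append the test point with score +inf and weight 1, as in
    # weighted conformal prediction, then normalize the weights.
    w = np.append(cal_weights, 1.0)
    w = w / w.sum()
    s = np.append(cal_scores, np.inf)

    # Weighted (1 - alpha) quantile of the augmented score distribution.
    order = np.argsort(s)
    cum = np.cumsum(w[order])
    qhat = s[order][np.searchsorted(cum, 1.0 - alpha)]

    # Prediction set: every class whose nonconformity score clears the cutoff.
    return {k for k, p in enumerate(test_probs) if 1.0 - p <= qhat}
```

With uniform weights this reduces to ordinary split conformal classification; making the weights depend on the network structure is what adapts the threshold to the non-exchangeability induced by the graph.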
