
Training-Conditional Coverage Bounds for Uniformly Stable Learning Algorithms (2404.13731v1)

Published 21 Apr 2024 in stat.ML and cs.LG

Abstract: The training-conditional coverage performance of conformal prediction is known to be empirically sound, and there have recently been efforts to support this observation with theoretical guarantees. Training-conditional coverage bounds for jackknife+ and full-conformal prediction regions were established by Liang and Barber [2023] via the notion of $(m,n)$-stability. Although this notion is weaker than uniform stability, it is not clear how to evaluate it for practical models. In this paper, we study training-conditional coverage bounds for full-conformal, jackknife+, and CV+ prediction regions from the perspective of uniform stability, which is known to hold for empirical risk minimization over reproducing kernel Hilbert spaces with convex regularization. We derive coverage bounds for finite-dimensional models via a concentration argument for the (estimated) predictor function, and compare these bounds with existing ones under ridge regression.
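To make the setting concrete, here is a minimal sketch of the jackknife+ prediction interval (Barber et al., 2021) wrapped around a closed-form ridge regression fit. This is an illustration of the baseline method the paper analyzes, not the paper's own contribution; the function names and the regularization parameter `lam` are our choices.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def jackknife_plus_interval(X, y, x_new, alpha=0.1, lam=1.0):
    # Jackknife+ interval: guarantees marginal coverage >= 1 - 2*alpha;
    # training-conditional coverage of such regions is what the paper
    # above studies via uniform stability.
    n = len(y)
    lo, hi = np.empty(n), np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        w = ridge_fit(X[mask], y[mask], lam)   # leave-one-out fit
        r = abs(y[i] - X[i] @ w)               # leave-one-out residual
        pred = x_new @ w
        lo[i], hi[i] = pred - r, pred + r
    k = int(np.ceil((1 - alpha) * (n + 1)))    # order-statistic index
    if k > n:
        return -np.inf, np.inf                 # too few points: trivial interval
    # k-th largest of the lower endpoints, k-th smallest of the upper ones
    return np.sort(lo)[n - k], np.sort(hi)[k - 1]
```

Unlike split conformal, no data is sacrificed to a hold-out set: each point serves once as its own calibration point, at the cost of n leave-one-out refits.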

References (15)
  1. G. Shafer and V. Vovk, “A tutorial on conformal prediction.” Journal of Machine Learning Research, vol. 9, no. 3, 2008.
  2. V. Vovk, I. Nouretdinov, and A. Gammerman, “On-line predictive linear regression,” The Annals of Statistics, pp. 1566–1590, 2009.
  3. V. Vovk, “Conditional validity of inductive conformal predictors,” in Asian Conference on Machine Learning. PMLR, 2012, pp. 475–490.
  4. R. Foygel Barber, E. J. Candes, A. Ramdas, and R. J. Tibshirani, “The limits of distribution-free conditional predictive inference,” Information and Inference: A Journal of the IMA, vol. 10, no. 2, pp. 455–482, 2021.
  5. J. Lei and L. Wasserman, “Distribution-free prediction bands for non-parametric regression,” Journal of the Royal Statistical Society Series B: Statistical Methodology, vol. 76, no. 1, pp. 71–96, 2014.
  6. C. Jung, G. Noarov, R. Ramalingam, and A. Roth, “Batch multivalid conformal prediction,” arXiv preprint arXiv:2209.15145, 2022.
  7. I. Gibbs, J. J. Cherian, and E. J. Candès, “Conformal prediction with conditional guarantees,” arXiv preprint arXiv:2305.12616, 2023.
  8. V. Vovk, D. Lindsay, I. Nouretdinov, and A. Gammerman, “Mondrian confidence machine,” Technical Report, 2003.
  9. M. Bian and R. F. Barber, “Training-conditional coverage for distribution-free predictive inference,” Electronic Journal of Statistics, vol. 17, no. 2, pp. 2044–2066, 2023.
  10. R. Liang and R. F. Barber, “Algorithmic stability implies training-conditional coverage for distribution-free prediction methods,” arXiv preprint arXiv:2311.04295, 2023.
  11. O. Bousquet and A. Elisseeff, “Stability and generalization,” The Journal of Machine Learning Research, vol. 2, pp. 499–526, 2002.
  12. R. F. Barber, E. J. Candes, A. Ramdas, and R. J. Tibshirani, “Predictive inference with the jackknife+,” The Annals of Statistics, vol. 49, no. 1, pp. 486–507, 2021.
  13. A. E. Hoerl and R. W. Kennard, “Ridge regression: Biased estimation for nonorthogonal problems,” Technometrics, vol. 12, no. 1, pp. 55–67, 1970.
  14. C. McDiarmid et al., “On the method of bounded differences,” Surveys in Combinatorics, vol. 141, no. 1, pp. 148–188, 1989.
  15. A. Dvoretzky, J. Kiefer, and J. Wolfowitz, “Asymptotic minimax character of the sample distribution function and of the classical multinomial estimator,” The Annals of Mathematical Statistics, pp. 642–669, 1956.
