Conformal Prediction = Bayes?

Published 29 Dec 2025 in math.ST and stat.ME (arXiv:2512.23308v1)

Abstract: Conformal prediction (CP) is widely presented as distribution-free predictive inference with finite-sample marginal coverage under exchangeability. We argue that CP is best understood as a rank-calibrated descendant of the Fisher-Dempster-Hill fiducial/direct-probability tradition rather than as Bayesian conditioning in disguise. We establish four separations from coherent countably additive predictive semantics. First, canonical conformal constructions violate conditional extensionality: prediction sets can depend on the marginal design P(X) even when P(Y|X) is fixed. Second, any finitely additive sequential extension preserving rank calibration is nonconglomerable, implying countable Dutch-book vulnerabilities. Third, rank-calibrated updates cannot be realized as regular conditionals of any countably additive exchangeable law on Y^∞. Fourth, formalizing both paradigms as families of one-step predictive kernels, conformal and Bayesian kernels coincide only on a Baire-meagre subset of the space of predictive laws. We further show that rank- and proxy-based reductions are generically Blackwell-deficient relative to full-data experiments, yielding positive Le Cam deficiency for suitable losses. Extending the analysis to prediction-powered inference (PPI) yields an analogous message: bias-corrected, proxy-rectified estimators can be valid as confidence devices while failing to define transportable belief states across stages, shifts, or adaptive selection. Together, the results sharpen a general limitation of wrappers: finite-sample calibration guarantees do not by themselves supply composable semantics for sequential updating or downstream decision-making.
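
The finite-sample marginal coverage the abstract refers to comes from a rank-calibrated construction on calibration scores. The sketch below is an illustrative split conformal prediction example, not code from the paper; the linear-regression data generator, the absolute-residual score, and all variable names are assumptions made for the demonstration.

```python
# Minimal sketch of split conformal prediction (illustrative; assumed toy setup).
# Prediction sets are built from the rank of calibration scores, which is what
# gives finite-sample marginal coverage under exchangeability.
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    # Toy data: Y = 2X + Gaussian noise (assumed for illustration only).
    x = rng.uniform(-1, 1, size=n)
    y = 2 * x + rng.normal(scale=0.5, size=n)
    return x, y

x_tr, y_tr = sample(500)    # training split
x_cal, y_cal = sample(500)  # calibration split
x_te, y_te = sample(1000)   # test points

# Fit any point predictor on the training split (here: least squares).
slope, intercept = np.polyfit(x_tr, y_tr, deg=1)
predict = lambda x: slope * x + intercept

# Nonconformity scores on the calibration split: absolute residuals.
scores = np.abs(y_cal - predict(x_cal))

# Rank-based threshold: the ceil((n+1)(1-alpha))-th smallest calibration score.
alpha = 0.1
n_cal = len(scores)
k = int(np.ceil((n_cal + 1) * (1 - alpha)))
q_hat = np.sort(scores)[k - 1]

# Prediction set C(x) = [predict(x) - q_hat, predict(x) + q_hat].
lower, upper = predict(x_te) - q_hat, predict(x_te) + q_hat
coverage = np.mean((y_te >= lower) & (y_te <= upper))
print(f"Empirical marginal coverage at alpha={alpha}: {coverage:.3f}")
```

Note that the guarantee demonstrated here is marginal over X; as the abstract points out, the resulting sets can still depend on the design P(X) even when P(Y|X) is held fixed.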
