
Confidence curves for UQ validation: probabilistic reference vs. oracle (2206.15272v2)

Published 30 Jun 2022 in physics.data-an

Abstract: Confidence curves are used in uncertainty validation to assess how large uncertainties ($u_{E}$) are associated with large errors ($E$). An oracle curve is commonly used as a reference to estimate the quality of the tested datasets. The oracle is a perfect, deterministic error predictor, such as $|E| = u_{E}$, which corresponds to a very unlikely error distribution in a probabilistic framework and is unable to inform us on the calibration of $u_{E}$. I propose here to replace the oracle by a probabilistic reference curve, derived from the more realistic scenario where errors should be random draws from a distribution with standard deviation $u_{E}$. The probabilistic curve and its confidence interval enable a direct test of the quality of a confidence curve. Paired with the probabilistic reference, a confidence curve can be used to check the calibration and tightness of prediction uncertainties.
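
As an illustration of the comparison the abstract describes, below is a minimal sketch, not the paper's code: the function names, the choice of MAE as the curve statistic, and the Gaussian error model are all assumptions. It builds a confidence curve by progressively discarding the points with the largest $u_{E}$, and a Monte Carlo probabilistic reference by simulating errors as draws from $N(0, u_{E})$.

```python
import numpy as np

def confidence_curve(errors, uncertainties, n_points=100):
    """MAE of the retained points after removing the fraction x of
    points with the largest uncertainties, for x in [0, 0.99]."""
    order = np.argsort(uncertainties)[::-1]      # largest u_E first
    abs_err = np.abs(errors)[order]
    fracs = np.linspace(0.0, 0.99, n_points)
    n = len(abs_err)
    curve = np.array([abs_err[int(f * n):].mean() for f in fracs])
    return fracs, curve

def probabilistic_reference(uncertainties, n_mc=1000, ci=(2.5, 97.5), seed=0):
    """Monte Carlo reference: simulate errors as draws from N(0, u_E)
    (an assumed error model) and collect the resulting curves."""
    rng = np.random.default_rng(seed)
    curves = []
    for _ in range(n_mc):
        e_sim = rng.normal(0.0, uncertainties)   # one simulated error set
        fracs, curve = confidence_curve(e_sim, uncertainties)
        curves.append(curve)
    curves = np.asarray(curves)
    lo, hi = np.percentile(curves, ci, axis=0)   # confidence band
    return fracs, curves.mean(axis=0), lo, hi

# Hypothetical usage with synthetic, well-calibrated data:
rng = np.random.default_rng(1)
u = rng.uniform(0.5, 2.0, size=500)              # predicted uncertainties u_E
e = rng.normal(0.0, u)                           # errors consistent with u_E
fracs, obs = confidence_curve(e, u)
_, ref, lo, hi = probabilistic_reference(u)
inside = np.mean((obs >= lo) & (obs <= hi))      # fraction of curve inside band
```

Under this sketch's assumptions, an observed confidence curve that stays within the reference band is consistent with errors being random draws scaled by $u_{E}$, whereas a systematic departure from the band signals miscalibration; the deterministic oracle offers no such band to test against.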

Citations (7)

Authors (1)