High-probability large-sample coverage for DNN-based calibration prediction intervals
Establish a high-probability large-sample conditional coverage guarantee for calibration prediction intervals (cPIs) constructed with deep neural network (DNN) estimators. Specifically, prove that there exists an integer N such that for all sample sizes n ≥ N, with probability approaching 1 over the training sample, the DNN-based cPI achieves conditional coverage of at least 1−α at any fixed covariate value X_f = x_f, analogous to the guarantee proved for the kernel-based calibration prediction intervals.
References
Thus, we aim for a cPI that guarantees a coverage rate of at least $1-\alpha$ with high probability when the sample size is large enough. We show that the cPI with kernel estimators exhibits this feature. For the cPI with DNN estimators, we conjecture that the property still holds, since such a high-probability guarantee can be viewed as an intermediate stage between a finite-sample and an asymptotic coverage guarantee.
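As a rough illustration of the target property, the sketch below estimates the conditional coverage of a cPI at a fixed covariate value x_f via Monte Carlo. Everything here is an assumption for illustration: the toy data-generating process, the Nadaraya-Watson kernel smoother standing in for the DNN estimator, and the local residual-quantile calibration are not the paper's actual construction.

```python
import numpy as np

# Illustrative Monte Carlo check of conditional coverage at a fixed x_f.
# The data-generating process, the kernel smoother, and all settings below
# are assumptions for this sketch, not the paper's construction.
rng = np.random.default_rng(0)
alpha = 0.10     # target miscoverage level
x_f = 0.5        # fixed covariate value of interest
sigma = 0.3      # noise standard deviation (known only to the simulator)

def m(x):
    """True regression function (toy assumption)."""
    return np.sin(2 * np.pi * x)

# Training sample.
n = 2000
X = rng.uniform(0.0, 1.0, n)
Y = m(X) + rng.normal(0.0, sigma, n)

def nw(x0, h=0.05):
    """Nadaraya-Watson estimate of E[Y | X = x0] with a Gaussian kernel,
    standing in for the DNN (or kernel) point estimator."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

# Calibrate locally: absolute residuals of training points near x_f.
h = 0.05
near = np.abs(X - x_f) < h
resid = np.abs(Y[near] - np.array([nw(x) for x in X[near]]))
q = np.quantile(resid, 1 - alpha)   # local (1 - alpha) residual quantile

# cPI at x_f, and its conditional coverage on fresh responses at X = x_f.
center = nw(x_f)
Y_new = m(x_f) + rng.normal(0.0, sigma, 20_000)
coverage = np.mean(np.abs(Y_new - center) <= q)
print(f"cPI at x_f = {x_f}: [{center - q:.3f}, {center + q:.3f}], "
      f"estimated conditional coverage = {coverage:.3f}")
```

With a training sample this large, the estimated conditional coverage should land near (or above) 1 − α, mirroring in simulation the high-probability large-sample guarantee that the task asks to be proved for the DNN-based cPI.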