Density-Regression: Efficient and Distance-Aware Deep Regressor for Uncertainty Estimation under Distribution Shifts (2403.05600v1)
Abstract: Modern deep-ensemble techniques achieve strong uncertainty estimation performance by performing multiple forward passes with different models. This comes at the price of high storage requirements and slow inference (test-time) speed. To address this issue, we propose Density-Regression, a method that leverages a density function for uncertainty estimation and achieves fast inference with a single forward pass. We prove it is distance-aware on the feature space, which is a necessary condition for a neural network to produce high-quality uncertainty estimates under distribution shifts. Empirically, we conduct experiments on regression tasks with a cubic toy dataset, the UCI benchmarks, time-series weather forecasting, and depth estimation under real-world distribution shifts. We show that Density-Regression matches the uncertainty estimation performance of modern deep regressors under distribution shifts while using a smaller model size and a faster inference speed.
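The core idea — scale a single deterministic regressor's predictive uncertainty by the estimated density of the input's features, so that uncertainty grows with distance from the training data — can be sketched as follows. This is a minimal illustration on the cubic toy setting: a Gaussian kernel density estimate and a fixed RBF feature map stand in for the paper's normalizing flow and learned deep features, and all names and parameters here are assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_features(x, centers, gamma=1.0):
    """Fixed RBF feature map (illustrative stand-in for a learned encoder)."""
    return np.exp(-gamma * (x[:, None] - centers[None, :]) ** 2)

rng = np.random.default_rng(0)

# In-distribution training data for the cubic toy task: y = x^3 + noise on [-4, 4].
x_train = rng.uniform(-4, 4, 200)
y_train = x_train ** 3 + rng.normal(0.0, 3.0, 200)

centers = np.linspace(-4, 4, 20)
Phi = rbf_features(x_train, centers)
w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)  # linear mean regressor on features

def log_density(z, feats, bandwidth=0.5):
    """Gaussian KDE over training features (stand-in for a normalizing flow)."""
    d2 = ((z[:, None, :] - feats[None, :, :]) ** 2).sum(-1)
    return np.log(np.exp(-d2 / (2.0 * bandwidth ** 2)).mean(axis=1) + 1e-12)

def predict(x, base_sigma=3.0):
    """Single forward pass: mean prediction plus density-scaled uncertainty."""
    phi = rbf_features(x, centers)
    mean = phi @ w
    # Low feature density (far from training data) inflates the predictive std,
    # which is the distance-awareness property the method relies on.
    logp = log_density(phi, Phi)
    logp_max = log_density(Phi, Phi).max()
    sigma = base_sigma / np.exp(np.clip(logp - logp_max, -10.0, 0.0))
    return mean, sigma

_, sigma_in = predict(np.array([0.0]))   # inside the training range
_, sigma_out = predict(np.array([8.0]))  # far outside the training range
```

Run on these two points, the predicted standard deviation for the out-of-distribution input is larger than for the in-distribution one, mirroring the distance-aware behavior the abstract claims; the real method replaces the KDE with a normalizing flow so the density itself is evaluated in one forward pass.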