Density-Regression: Efficient and Distance-Aware Deep Regressor for Uncertainty Estimation under Distribution Shifts (2403.05600v1)

Published 7 Mar 2024 in cs.LG and stat.ML

Abstract: Modern deep ensemble techniques achieve strong uncertainty estimation performance by performing multiple forward passes with different models. This comes at the price of high storage requirements and slow inference (test-time) speed. To address this issue, we propose Density-Regression, a method that leverages a density function for uncertainty estimation and achieves fast inference with a single forward pass. We prove it is distance-aware on the feature space, a necessary condition for a neural network to produce high-quality uncertainty estimates under distribution shifts. Empirically, we conduct experiments on regression tasks with a cubic toy dataset, the UCI benchmark, time-series weather forecasting, and depth estimation in real-world shifted applications. We show that Density-Regression achieves uncertainty estimation performance under distribution shifts competitive with modern deep regressors while using a smaller model size and a faster inference speed.
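No code accompanies this abstract, so the following is a minimal, hypothetical sketch of the idea it describes: fit a density model on the training features, then inflate the regressor's predictive variance wherever test features land in low-density regions. The identity feature map, linear regressor, and kernel density estimator below are illustrative stand-ins, not the paper's construction (which trains a deep feature extractor with a learned density function).

```python
# A minimal, hypothetical sketch (not the authors' code): a regressor whose
# predictive variance is inflated wherever a density model, fit on the
# training features, assigns low density to the test features.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)

# Toy cubic data, mirroring the paper's cubic toy setting: train on [-3, 3].
X_train = rng.uniform(-3.0, 3.0, size=(200, 1))
y_train = X_train[:, 0] ** 3 + rng.normal(0.0, 3.0, size=200)

# Stand-in pieces: identity "features" and a linear regressor; the paper
# uses a deep feature extractor and a learned density function instead.
features = X_train
regressor = LinearRegression().fit(features, y_train)
residual_var = float(np.var(y_train - regressor.predict(features)))

# Density model over the feature space (a KDE here, purely for illustration).
kde = KernelDensity(bandwidth=0.5).fit(features)
density_ref = np.exp(kde.score_samples(features)).max()

def predict_with_uncertainty(X):
    """One forward pass: mean plus a density-scaled predictive std."""
    mean = regressor.predict(X)
    rel_density = np.exp(kde.score_samples(X)) / density_ref
    # Low density (far from the training data) -> large variance: the
    # distance-aware behavior the abstract describes.
    var = residual_var / np.clip(rel_density, 1e-6, 1.0)
    return mean, np.sqrt(var)

for x in (0.0, 8.0):  # in-distribution vs. shifted input
    mu, sigma = predict_with_uncertainty(np.array([[x]]))
    print(f"x={x:+.1f}  mean={mu[0]:+8.2f}  std={sigma[0]:.2f}")
```

On the in-distribution input the reported standard deviation stays near the training residual scale; on the shifted input it grows by orders of magnitude, which is the distance-aware behavior claimed in the abstract.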
