Calibrating Uncertainties in Object Localization Task (1811.11210v1)
Abstract: In many safety-critical applications such as autonomous driving and surgical robots, it is desirable to obtain prediction uncertainties from object detection modules to support safe decision-making. Specifically, such modules need to estimate the probability of each predicted object in a given region and the confidence interval for its bounding box. While recent Bayesian deep learning methods provide a principled way to estimate these uncertainties, the estimates for the bounding boxes obtained using these methods are uncalibrated. In this paper, we address this problem for the single-object localization task by adapting an existing technique for calibrating regression models. We show, experimentally, that the resulting calibrated model obtains more reliable uncertainty estimates.
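The "existing technique for calibrating regression models" the abstract refers to is, in spirit, the recalibration approach of Kuleshov et al. (ICML 2018): fit a monotone map from predicted to empirical CDF values on held-out data, then use that map to adjust confidence intervals. The following is a minimal sketch of that idea, not the paper's implementation; it assumes Gaussian predictive distributions per box coordinate (e.g., mean and standard deviation estimated from MC-dropout samples), and all function names and inputs here are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from sklearn.isotonic import IsotonicRegression

def fit_recalibrator(mu, sigma, y):
    """Fit an isotonic map from predicted to empirical CDF values
    on a held-out set (one box coordinate at a time)."""
    # CDF value of each observed target under the Gaussian
    # predictive distribution N(mu, sigma^2).
    p_pred = norm.cdf(y, loc=mu, scale=sigma)
    # Empirical coverage: fraction of held-out points whose predicted
    # CDF value falls at or below each p_pred.
    p_emp = np.array([(p_pred <= p).mean() for p in p_pred])
    iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
    iso.fit(p_pred, p_emp)
    return iso

def calibrated_interval(iso, mu, sigma, level=0.95):
    """Turn a desired coverage level into a calibrated interval for a
    single new prediction by inverting the isotonic map on a grid."""
    lo_q, hi_q = (1.0 - level) / 2.0, (1.0 + level) / 2.0
    grid = np.linspace(1e-4, 1.0 - 1e-4, 1000)
    mapped = iso.predict(grid)  # non-decreasing by construction
    lo_p = grid[min(int(np.searchsorted(mapped, lo_q)), len(grid) - 1)]
    hi_p = grid[min(int(np.searchsorted(mapped, hi_q)), len(grid) - 1)]
    # Invert the Gaussian CDF at the adjusted quantiles.
    return norm.ppf(lo_p, mu, sigma), norm.ppf(hi_p, mu, sigma)

# Toy example: recalibrate the x-coordinate of predicted box centers
# whose predictive stds understate the true noise (overconfidence).
rng = np.random.default_rng(0)
mu = rng.uniform(0, 100, size=500)        # hypothetical predictive means
sigma = np.full(500, 2.0)                 # overconfident predictive stds
y = mu + rng.normal(0, 4.0, size=500)     # true coordinates (noisier)
iso = fit_recalibrator(mu, sigma, y)
print(calibrated_interval(iso, mu=50.0, sigma=2.0, level=0.95))
```

Because the held-out targets are noisier than the model's stated uncertainty, the recalibrated 95% interval comes out wider than the naive ±1.96σ band, which is exactly the miscalibration the paper sets out to correct for bounding boxes.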
- Buu Phan
- Rick Salay
- Krzysztof Czarnecki
- Vahdat Abdelzad
- Taylor Denouden
- Sachin Vernekar