
Controlled Dropout for Uncertainty Estimation (2205.03109v1)

Published 6 May 2022 in cs.LG and cs.CV

Abstract: Uncertainty quantification in neural networks is one of the most discussed topics for safety-critical applications. Though Neural Networks (NNs) have achieved state-of-the-art performance in many applications, they still provide unreliable point predictions that lack uncertainty estimates. Among the various methods for enabling neural networks to estimate uncertainty, Monte Carlo (MC) dropout has quickly gained popularity due to its simplicity. In this study, we present a new version of the traditional dropout layer in which the number of dropout configurations is fixed. Each layer can adopt this new dropout layer and apply it within the MC method to quantify the uncertainty associated with NN predictions. We conduct experiments on both toy and realistic datasets and compare the results with the MC method using the traditional dropout layer. Analysis using uncertainty evaluation metrics corroborates that our dropout layer offers better performance in most cases.
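
The abstract describes the core idea only at a high level: rather than drawing a fresh random mask on every forward pass, the layer keeps a fixed number of dropout configurations that the MC method then samples over. Below is a minimal PyTorch sketch of that idea as we read it from the abstract; the class name `ControlledDropout`, the `num_configs` parameter, and the mask-cycling scheme are our own illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of "controlled dropout": instead of sampling a fresh
# Bernoulli mask on every forward pass (as nn.Dropout does), pre-sample a
# fixed pool of num_configs masks and cycle through them deterministically.
import torch
import torch.nn as nn


class ControlledDropout(nn.Module):
    def __init__(self, p: float = 0.5, num_configs: int = 10):
        super().__init__()
        self.p = p
        self.num_configs = num_configs
        self.masks = None  # built lazily once the feature size is known
        self.idx = 0

    def _build_masks(self, num_features: int, device, dtype):
        # Fixed pool of dropout configurations, scaled as in inverted dropout.
        keep = 1.0 - self.p
        bern = torch.rand(self.num_configs, num_features, device=device) < keep
        self.masks = bern.to(dtype) / keep

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # identity at plain inference, like standard dropout
        if self.masks is None:
            self._build_masks(x.shape[-1], x.device, x.dtype)
        mask = self.masks[self.idx % self.num_configs]
        self.idx += 1  # cycle through the fixed set of configurations
        return x * mask


# MC-dropout-style uncertainty estimate: keep dropout active at test time
# and aggregate predictions over the fixed pool of dropout configurations.
model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(),
                      ControlledDropout(p=0.5, num_configs=10),
                      nn.Linear(32, 1))
model.train()  # keeps the dropout layer active during MC sampling
x = torch.randn(8, 4)
with torch.no_grad():
    preds = torch.stack([model(x) for _ in range(10)])
mean, std = preds.mean(dim=0), preds.std(dim=0)  # std ~ predictive uncertainty
```

In this reading, the sampling budget is known in advance: one MC forward pass per stored configuration, with the predictive mean and standard deviation computed over exactly `num_configs` passes.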

Authors (5)
  1. Mehedi Hasan (30 papers)
  2. Abbas Khosravi (43 papers)
  3. Ibrahim Hossain (3 papers)
  4. Ashikur Rahman (6 papers)
  5. Saeid Nahavandi (61 papers)
Citations (1)
