A Bayesian Neural Network based on Dropout Regulation (2102.01968v1)

Published 3 Feb 2021 in cs.LG, cs.AI, and cs.NE

Abstract: Bayesian Neural Networks (BNN) have recently emerged in the Deep Learning world for dealing with uncertainty estimation in classification tasks, and are used in many application domains such as astrophysics, autonomous driving... BNN assume a prior over the weights of a neural network instead of point estimates, enabling in this way the estimation of both aleatoric and epistemic uncertainty of the model prediction. Moreover, a particular type of BNN, namely MC Dropout, assumes a Bernoulli distribution on the weights by using Dropout. Several attempts to optimize the dropout rate exist, e.g. using a variational approach. In this paper, we present a new method called "Dropout Regulation" (DR), which consists of automatically adjusting the dropout rate during training using a controller as used in automation. DR allows for a precise estimation of the uncertainty which is comparable to the state-of-the-art while remaining simple to implement.
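
The abstract describes two ingredients: MC Dropout inference (keeping Dropout active at prediction time and averaging several stochastic forward passes to estimate uncertainty) and Dropout Regulation (a controller that adjusts the dropout rate during training). The PyTorch sketch below is a minimal illustration of those ideas, not the paper's implementation: the `regulate_dropout` helper, its `gain` parameter, and the use of the train/validation accuracy gap as the controller's error signal are assumptions for illustration; the paper only states that the rate is adjusted by a controller as used in automation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MCDropoutNet(nn.Module):
    """Small classifier whose dropout stays active at test time (MC Dropout)."""
    def __init__(self, in_features=784, hidden=256, n_classes=10, p=0.5):
        super().__init__()
        self.p = p  # current dropout rate, adjusted externally by the regulator below
        self.fc1 = nn.Linear(in_features, hidden)
        self.fc2 = nn.Linear(hidden, n_classes)

    def forward(self, x):
        h = F.relu(self.fc1(x))
        # Functional dropout so the rate can change between steps;
        # training=True keeps it stochastic even at evaluation time (MC Dropout).
        h = F.dropout(h, p=self.p, training=True)
        return self.fc2(h)

@torch.no_grad()
def mc_predict(model, x, n_samples=20):
    """Average several stochastic forward passes; the spread of the sampled
    softmax outputs reflects the model's (epistemic) predictive uncertainty."""
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0), probs.std(dim=0)

def regulate_dropout(model, train_acc, val_acc, gain=0.5, p_min=0.05, p_max=0.8):
    """Hypothetical proportional regulator (an assumption, not the paper's
    controller): increase dropout when the model overfits (train_acc >> val_acc),
    decrease it as the gap closes."""
    error = train_acc - val_acc  # overfitting signal used as the controller error
    model.p = float(min(p_max, max(p_min, model.p + gain * error)))
    return model.p
```

In such a setup, `regulate_dropout` would be called once per epoch after computing training and validation accuracy, and `mc_predict` would be used at test time to obtain both a prediction and an uncertainty estimate.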

Authors (5)
  1. Claire Theobald (3 papers)
  2. Frédéric Pennerath (5 papers)
  3. Brieuc Conan-Guez (12 papers)
  4. Miguel Couceiro (61 papers)
  5. Amedeo Napoli (25 papers)
Citations (4)
