Bayesian Neural Networks with Soft Evidence

Published 19 Oct 2020 in cs.LG and stat.ML (arXiv:2010.09570v2)

Abstract: Bayes's rule deals with hard evidence: we can calculate the probability of event $A$ occurring given that event $B$ has occurred. Soft evidence, on the other hand, involves a degree of uncertainty about whether event $B$ has actually occurred. Jeffrey's rule of conditioning provides a way to update beliefs in the case of soft evidence. We provide a framework for learning a probability distribution on the weights of a neural network trained using soft evidence, by way of two simple algorithms for approximating Jeffrey conditionalization. We propose an experimental protocol for benchmarking these algorithms on empirical datasets and find that the Jeffrey-based methods are competitive or better in terms of accuracy, while showing improvements in calibration metrics of upwards of 20% in some cases, even when the data contains mislabeled points.
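
For reference, Jeffrey's rule of conditioning, which the abstract invokes but does not state, updates the probability of an event $A$ when the evidence event $B$ is believed to have occurred only with probability $q$ rather than with certainty:

$$P'(A) = q \, P(A \mid B) + (1 - q) \, P(A \mid \neg B)$$

Setting $q = 1$ (hard evidence) recovers ordinary Bayesian conditioning, $P'(A) = P(A \mid B)$. Below is a minimal numerical sketch of the rule in Python; the scenario and probabilities are invented for illustration and are not taken from the paper:

```python
# Illustrative sketch of Jeffrey's rule of conditioning for a binary
# event A given soft evidence about a binary event B.

def jeffrey_update(p_a_given_b: float, p_a_given_not_b: float, q: float) -> float:
    """Posterior P'(A) when B is believed to hold with probability q.

    Hard evidence (q = 1.0) reduces to ordinary Bayesian conditioning,
    P'(A) = P(A | B).
    """
    return q * p_a_given_b + (1.0 - q) * p_a_given_not_b

# Hypothetical example: A = "label is correct", B = "annotator was confident".
print(jeffrey_update(p_a_given_b=0.9, p_a_given_not_b=0.3, q=0.7))  # 0.72
```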
