
Information gains from Monte Carlo Markov Chains (1904.11920v1)

Published 26 Apr 2019 in astro-ph.CO and hep-ex

Abstract: In this paper, we present a novel method for computing the relative entropy, as well as the expected relative entropy, from an MCMC chain. The relative entropy from information theory can be used to quantify differences between the posterior distributions of a pair of experiments. In cosmology, the relative entropy has been proposed as a useful tool for model selection, experiment design, forecasting, and measuring the information gain from subsequent experiments. In contrast to the Gaussian case, these quantities are generally not available analytically, and estimating them numerically can be computationally expensive. We propose a method, and provide an accompanying Python package, to estimate the relative entropy as well as the expected relative entropy from a posterior sample. We use the linear Gaussian model to check the accuracy of our code. Our results indicate that the relative error is below $0.2\%$ for sample sizes larger than $10^5$ in the linear Gaussian model. In addition, we study the robustness of our code in estimating the expected relative entropy in this model.
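
For reference, the relative entropy (Kullback-Leibler divergence) between two posteriors $p_2$ and $p_1$ is defined as

$$D(p_2 \,\|\, p_1) = \int p_2(\theta) \ln \frac{p_2(\theta)}{p_1(\theta)} \, d\theta .$$

In the linear Gaussian model used as the accuracy check, both posteriors are multivariate Gaussians $\mathcal{N}(\mu_i, \Sigma_i)$ in $d$ dimensions, and the relative entropy takes the standard closed form

$$D = \frac{1}{2} \left[ \operatorname{tr}\!\left(\Sigma_1^{-1} \Sigma_2\right) + (\mu_1 - \mu_2)^\top \Sigma_1^{-1} (\mu_1 - \mu_2) - d + \ln \frac{\det \Sigma_1}{\det \Sigma_2} \right],$$

against which a sample-based estimate can be validated.

The abstract does not spell out the estimator implemented in the package, so the following is only a minimal sketch of one standard sample-based approach: the k-nearest-neighbour relative-entropy estimator of Wang, Kulkarni and Verdú (2009), applied to two sets of posterior samples such as thinned MCMC chains. The function name kl_knn is hypothetical and is not taken from the paper.

import numpy as np
from scipy.spatial import cKDTree

def kl_knn(x, y):
    """Estimate D(p || q) in nats from samples x ~ p and y ~ q.

    x : (n, d) array of samples from p
    y : (m, d) array of samples from q
    """
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its nearest neighbour within x,
    # skipping the zero-distance match with itself (hence k=2).
    rho = cKDTree(x).query(x, k=2)[0][:, 1]
    # Distance from each x_i to its nearest neighbour in y.
    nu = cKDTree(y).query(x, k=1)[0]
    # 1-NN estimator: (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1)).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

# Sanity check against the analytic Gaussian formula above:
# D(N(1,1) || N(0,1)) = 0.5 nat in one dimension.
rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=(100_000, 1))
y = rng.normal(0.0, 1.0, size=(100_000, 1))
print(kl_knn(x, y))  # should print roughly 0.5

This mirrors the kind of linear Gaussian consistency check described in the abstract, although the estimator shown here is a generic one and not necessarily the authors' method.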
