A Tight Lower Bound on the Mutual Information of a Binary and an Arbitrary Finite Random Variable in Dependence of the Variational Distance
Abstract: This paper presents a numerical method that computes a lower bound on the mutual information between a binary and an arbitrary finite random variable, valid for all joint distributions whose variational distance from a known joint distribution is at most a known value. This lower bound can be applied to mutual information estimation with confidence intervals.
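To make the setting concrete, the following is a minimal Python sketch of the optimization problem the abstract describes, not the paper's actual algorithm: it numerically minimizes the mutual information over joint pmfs within a prescribed distance of a known joint distribution. The function names, the use of scipy.optimize, and the convention that the variational distance is the L1 distance between joint pmfs are all assumptions made for illustration.

# Sketch only: minimize I(X;Y) over joint pmfs q of a binary X and a
# finite Y subject to ||q - p||_1 <= eps, where p is a known joint pmf.
# The L1 convention for "variational distance" is an assumption; the
# paper may normalize differently (e.g., total variation = L1 / 2).
import numpy as np
from scipy.optimize import minimize

def mutual_information(q):
    """Mutual information (in nats) of a joint pmf given as a 2 x n array."""
    q = np.clip(q, 1e-12, None)          # avoid log(0)
    px = q.sum(axis=1, keepdims=True)    # marginal of the binary variable
    py = q.sum(axis=0, keepdims=True)    # marginal of the finite variable
    return float(np.sum(q * np.log(q / (px * py))))

def mi_lower_bound(p, eps):
    """Heuristically minimize I(X;Y) over joint pmfs q with ||q - p||_1 <= eps."""
    shape, p_flat = p.shape, p.ravel()
    cons = [
        # q must be a probability distribution ...
        {"type": "eq", "fun": lambda q: q.sum() - 1.0},
        # ... and must stay inside the variational-distance ball around p.
        {"type": "ineq", "fun": lambda q: eps - np.abs(q - p_flat).sum()},
    ]
    res = minimize(lambda q: mutual_information(q.reshape(shape)),
                   p_flat, bounds=[(0.0, 1.0)] * p.size,
                   constraints=cons, method="SLSQP")
    return res.fun

# Example: a known joint pmf of a binary X and a ternary Y.
p = np.array([[0.30, 0.10, 0.10],
              [0.05, 0.25, 0.20]])
print(mi_lower_bound(p, eps=0.1))   # <= mutual_information(p)

Because SLSQP is a local optimizer and the L1 constraint is non-smooth, this sketch only illustrates the problem being solved; the paper's contribution is a numerical method that computes a provably tight lower bound for it.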