
Minimizing and Maximizing the Shannon Entropy for Fixed Marginals (2509.05099v1)

Published 5 Sep 2025 in math.OC

Abstract: The mutual information (MI) between two random variables is an important correlation measure in data analysis. When the marginals are fixed, the marginal entropies are constant, so the Shannon entropy of the joint probability distribution is the only variable part of the MI. We minimize and maximize this joint entropy to obtain the largest and smallest MI achievable under the given marginals, leading to a scaled MI ratio for better comparability. We present algorithmic approaches and optimal solutions for a set of problem instances based on data from molecular evolution, and we show that this allows us to construct a sensible, systematic correction to raw MI values.
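The sketch below illustrates the idea behind the abstract, not the paper's algorithms: it computes MI from a joint distribution via MI(X;Y) = H(X) + H(Y) - H(X,Y), uses the product of the marginals as the maximum-entropy joint (giving the smallest MI, namely 0), and uses a simple greedy heuristic as a stand-in for the entropy-minimization step. The greedy routine and the exact form of the scaled ratio are assumptions for illustration only.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a probability array, ignoring zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """MI(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix."""
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(joint)

def greedy_min_entropy_joint(px, py):
    """Greedy heuristic (an assumption, not the authors' method): concentrate
    mass on few cells by repeatedly pairing the largest remaining marginals.
    Yields an upper bound on the minimum joint entropy, hence a lower bound
    on the maximum MI attainable with these marginals."""
    px, py = px.copy(), py.copy()
    joint = np.zeros((len(px), len(py)))
    while px.sum() > 1e-12:
        i, j = np.argmax(px), np.argmax(py)
        m = min(px[i], py[j])
        joint[i, j] += m
        px[i] -= m
        py[j] -= m
    return joint

# Example: raw MI of an observed joint vs. the extremes for its marginals.
observed = np.array([[0.3, 0.1],
                     [0.1, 0.5]])
px, py = observed.sum(axis=1), observed.sum(axis=0)

mi_raw = mutual_information(observed)
mi_min = mutual_information(np.outer(px, py))                  # product joint: max entropy, MI = 0
mi_max = mutual_information(greedy_min_entropy_joint(px, py))  # heuristic minimum-entropy joint

# One plausible scaled ratio (assumed form): where the raw MI lies between the extremes.
mi_scaled = (mi_raw - mi_min) / (mi_max - mi_min)
print(mi_raw, mi_min, mi_max, mi_scaled)
```

The paper itself treats the entropy minimization and maximization as optimization problems and reports optimal solutions for instances from molecular evolution data; the greedy coupling above merely stands in for that step to keep the example self-contained.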
