
Vendi Information Gain: An Alternative To Mutual Information For Science And Machine Learning (2505.09007v2)

Published 13 May 2025 in cs.IT and math.IT

Abstract: In his 1948 seminal paper A Mathematical Theory of Communication that birthed information theory, Claude Shannon introduced mutual information (MI), which he called "rate of transmission", as a way to quantify information gain (IG), defining it as the difference between the marginal and conditional entropy of a random variable. While MI has become a standard tool in science and engineering, it has several shortcomings. First, MI is often intractable - it requires a density over samples with tractable Shannon entropy - and existing techniques for approximating it often fail, especially in high dimensions. Moreover, in settings where MI is tractable, its symmetry and insensitivity to sample similarity are undesirable. In this paper, we propose the Vendi Information Gain (VIG), a novel alternative to MI that leverages the Vendi Scores (VS), a flexible family of similarity-based diversity metrics. We call the logarithm of the VS the Vendi entropy and define VIG as the difference between the marginal and conditional Vendi entropy of a variable. Being based on the VS, VIG accounts for similarity. Furthermore, VIG generalizes MI and recovers it under the assumption that the samples are completely dissimilar. Importantly, VIG only requires samples and not a probability distribution over them. Finally, it is asymmetric, a desideratum for a good measure of IG that MI fails to meet. VIG extends information theory to settings where MI completely fails. For example, we use VIG to describe a novel, unified framework for active data acquisition, a popular paradigm of modern data-driven science. We demonstrate the advantages of VIG over MI in diverse applications, including in cognitive science to model human response times to external stimuli and in epidemiology to learn epidemic processes and identify disease hotspots in different countries via level-set estimation.
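The abstract defines the Vendi entropy as the logarithm of the Vendi Score, a similarity-based diversity metric computed from samples alone. The sketch below, a minimal illustration rather than the authors' implementation, computes the Vendi entropy as the Shannon entropy of the eigenvalues of a normalized similarity matrix, which is how the Vendi Score is defined in the prior work the paper builds on. The function name and the specific kernel choice are assumptions for illustration; the check at the end reflects the abstract's claim that with completely dissimilar samples (an identity similarity matrix) the Vendi entropy reduces to the ordinary log-count, the regime in which VIG recovers MI.

```python
import numpy as np

def vendi_entropy(K):
    """Vendi entropy: log of the Vendi Score of a similarity matrix K.

    K is an n x n positive semi-definite similarity matrix with unit
    diagonal (each sample is maximally similar to itself). The Vendi
    Score is the exponential of the Shannon entropy of the eigenvalues
    of K / n, so the Vendi entropy is that eigenvalue entropy itself.
    """
    n = K.shape[0]
    lam = np.linalg.eigvalsh(K / n)   # eigenvalues sum to 1 (trace(K) = n)
    lam = lam[lam > 1e-12]            # drop numerical zeros; 0 * log 0 = 0
    return float(-np.sum(lam * np.log(lam)))

# Completely dissimilar samples: K is the identity, and the Vendi
# entropy equals log n, matching Shannon entropy of a uniform draw.
n = 5
h_dissimilar = vendi_entropy(np.eye(n))      # equals log(5)

# Identical samples: K is all ones, diversity collapses, entropy is 0.
h_identical = vendi_entropy(np.ones((n, n)))
```

A VIG-style quantity would then be the difference between the marginal Vendi entropy of a variable's samples and its conditional Vendi entropy after observing another variable, per the abstract's definition; because the similarity kernel can differ per variable, the resulting measure is asymmetric, unlike MI.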
