Dependence and Uncertainty: Information Measures using Tsallis Entropy (2502.12779v1)

Published 18 Feb 2025 in stat.ME, cs.IT, and math.IT

Abstract: In multivariate analysis, uncertainty arises from two sources: the marginal distributions of the variables and their dependence structure. Quantifying the dependence structure is crucial, as it provides valuable insights into the relationships among the components of a random vector. Copula functions capture this dependence structure independently of the marginals, which makes copula-based information measures highly significant. However, existing copula-based information measures, such as entropy, divergence, and mutual information, rely on copula densities, which may not exist in many scenarios, limiting their applicability. To address this issue, Arshad et al. (2024) recently introduced cumulative copula-based measures using Shannon entropy. In this paper, we extend that framework using Tsallis entropy, a non-additive entropy that provides greater flexibility for quantifying uncertainty. We propose the cumulative copula Tsallis entropy, derive its properties and bounds, and illustrate its utility through examples. We further develop a non-parametric version of the measure and validate it using coupled periodic and chaotic maps. Additionally, we extend Kerridge's inaccuracy measure and the Kullback-Leibler (KL) divergence to the cumulative copula framework. Using the relationship between KL divergence and mutual information, we propose a new cumulative mutual information (CMI) measure, which overcomes the limitations of density-based mutual information. Furthermore, we introduce a procedure for testing mutual independence among random variables using the CMI measure. Finally, we illustrate the potential of the proposed CMI measure as an economic indicator using real bivariate financial time-series data.
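
For readers unfamiliar with the ingredients, the LaTeX block below states two standard definitions the abstract builds on: the Tsallis entropy of order q (whose q → 1 limit recovers Shannon entropy) and the identity expressing mutual information as a KL divergence. The abstract does not reproduce the cumulative copula Tsallis entropy itself, so the last formula is an assumed form, written by direct analogy with the cumulative copula (Shannon) entropy of Arshad et al. (2024), not the paper's verbatim definition.

```latex
% Tsallis entropy of order q (continuous form); Shannon entropy is the q -> 1 limit.
S_q(X) = \frac{1}{q-1}\left(1 - \int f(x)^q \, dx\right), \qquad q > 0,\; q \neq 1.

% Mutual information as a KL divergence between the joint density and the
% product of the marginals (the relationship the abstract invokes for CMI).
I(X;Y) = D_{\mathrm{KL}}\!\left(f_{X,Y} \,\middle\|\, f_X f_Y\right).

% ASSUMED form of the cumulative copula Tsallis entropy: the copula density c
% is replaced by the copula C itself, which always exists by Sklar's theorem.
\mathcal{CCT}_\alpha(C) = \frac{1}{\alpha-1}\int_{[0,1]^d}
  \left( C(\mathbf{u}) - C(\mathbf{u})^{\alpha} \right) d\mathbf{u},
\qquad \alpha > 0,\; \alpha \neq 1.
```

Because the cumulative form integrates the copula C rather than its density, it remains well defined for singular or partly discrete dependence structures where density-based measures break down, which is the motivation the abstract highlights.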

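The abstract also mentions a non-parametric version of the measure. A plug-in estimator via the empirical copula is the natural route; the Python sketch below is a minimal illustration under the assumed formula above, not the paper's estimator, and the helper names (empirical_copula_grid, cumulative_copula_tsallis) are hypothetical.

```python
import numpy as np

def empirical_copula_grid(x, y, m=64):
    """Empirical copula C_n on an m x m grid over [0,1]^2.

    C_n(u, v) = (1/n) * #{i : R_i/n <= u, S_i/n <= v}, where R_i, S_i are
    the within-sample ranks of x_i and y_i (pseudo-observations).
    """
    n = len(x)
    u_i = (np.argsort(np.argsort(x)) + 1) / n  # ranks of x, rescaled to (0, 1]
    v_i = (np.argsort(np.argsort(y)) + 1) / n  # ranks of y, rescaled to (0, 1]
    grid = np.arange(1, m + 1) / m
    U, V = np.meshgrid(grid, grid, indexing="ij")
    # Fraction of pseudo-observations dominated by each grid node (u, v).
    C = np.mean((u_i <= U[..., None]) & (v_i <= V[..., None]), axis=-1)
    return C

def cumulative_copula_tsallis(x, y, alpha=2.0, m=64):
    """Plug-in estimate of the ASSUMED measure
    (1/(alpha-1)) * integral over [0,1]^2 of (C - C**alpha),
    replacing C with the empirical copula and the integral with a grid mean.
    """
    C = empirical_copula_grid(x, y, m)
    return np.mean(C - C ** alpha) / (alpha - 1.0)  # each cell has area 1/m^2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y_indep = rng.normal(size=500)          # independent of x
    y_dep = x + 0.1 * rng.normal(size=500)  # strongly dependent on x
    print(cumulative_copula_tsallis(x, y_indep))  # near the independence value
    print(cumulative_copula_tsallis(x, y_dep))    # shifts under dependence
```

As a sanity check on the assumed form: for the independence copula C(u, v) = uv and alpha = 2, the integral evaluates in closed form to 1/4 - 1/9 ≈ 0.139, which the independent-sample estimate should approach as the sample size and grid resolution grow.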