
Shannon entropy for imprecise and under-defined or over-defined information (1709.04729v1)

Published 14 Sep 2017 in cs.IT and math.IT

Abstract: Shannon entropy was defined for probability distributions, and its use was later extended to measure the uncertainty of knowledge in systems with complete information. This article proposes extending the use of Shannon entropy to under-defined or over-defined information systems. To apply Shannon entropy, the information is first normalized by an affine transformation, constructed in two stages: one for homothety and one for translation. The approach also covers information with a certain degree of imprecision. In addition, the article shows the use of Shannon entropy for particular cases such as neutrosophic information (in both the trivalent and bivalent cases), bifuzzy information, intuitionistic fuzzy information, imprecise fuzzy information, and fuzzy partitions.
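
The abstract describes normalizing under-defined or over-defined information so that Shannon entropy can be applied. The sketch below is not the paper's construction: it assumes the normalization can be approximated by simple proportional rescaling of the components (only a homothety-like step, without the translation stage of the paper's affine transformation), and the helper names normalize and shannon_entropy are hypothetical.

import math

def normalize(components, epsilon=1e-12):
    # Rescale non-negative membership degrees so they sum to 1.
    # Simplified stand-in for the paper's two-stage affine transformation.
    total = sum(components)
    if total < epsilon:
        # Fully under-defined case: fall back to a uniform distribution.
        return [1.0 / len(components)] * len(components)
    return [c / total for c in components]

def shannon_entropy(distribution):
    # Standard Shannon entropy H = -sum(p * log2(p)), in bits.
    return -sum(p * math.log2(p) for p in distribution if p > 0)

# Example: an over-defined neutrosophic triple (T, I, F) whose sum exceeds 1.
tif = (0.8, 0.5, 0.4)
print(shannon_entropy(normalize(tif)))  # entropy of the normalized triple

Usage note: for an under-defined triple (sum below 1) the same rescaling applies; the paper's actual transformation additionally handles imprecise values and the bivalent/trivalent neutrosophic cases.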

Citations (3)
