Properties of Shannon and Rényi entropies of the Poisson distribution as the functions of intensity parameter (2403.08805v1)

Published 6 Feb 2024 in cs.IT, math.IT, and math.PR

Abstract: We consider two types of entropy, namely, the Shannon and Rényi entropies of the Poisson distribution, and establish their properties as functions of the intensity parameter. More precisely, we prove that both entropies increase with the intensity. While for the Shannon entropy the proof is comparatively simple, for the Rényi entropy, which depends on an additional parameter $\alpha>0$, the proof is nontrivial. It is based on an application of Karamata's inequality to the terms of the Poisson distribution.

References (20)
  1. Estimating Rényi entropy of discrete distributions. IEEE Trans. Inform. Theory, 63(1):38–56, 2017.
  2. Effectiveness of an entropy-based approach for detecting low- and high-rate DDoS attacks against the SDN controller: Experimental analysis. Applied Sciences, 13(2), 2023.
  3. Bayesian entropy estimation for countable discrete distributions. J. Mach. Learn. Res., 15:2833–2868, 2014.
  4. Convexity and robustness of the Rényi entropy. Mod. Stoch. Theory Appl., 8(3):387–412, 2021.
  5. Mahdi Cheraghchi. Expressions for the entropy of basic discrete distributions. IEEE Trans. Inform. Theory, 65(7):3999–4009, 2019.
  6. DDoS attack detection using fast entropy approach on flow- based network traffic. Procedia Computer Science, 50:30–36, 2015. Big Data, Cloud and Computing Challenges.
  7. Ronald J. Evans and J. Boersma. The entropy of a Poisson distribution (C. Robert Appledorn). SIAM Review, 30(2):314–317, 1988.
  8. Adaptive estimation of Shannon entropy. In 2015 IEEE International Symposium on Information Theory (ISIT), pages 1372–1376, Hong Kong, China, 2015.
  9. Minimax estimation of functionals of discrete distributions. IEEE Trans. Inform. Theory, 61(5):2835–2885, 2015.
  10. Inequalities of Karamata, Schur and Muirhead, and some applications. The Teaching of Mathematics, 14:31–45, 2005.
  11. Characterizing complexity and self-similarity based on fractal and entropy analyses for stock market forecast modelling. Expert Systems with Applications, 144:113098, 2020.
  12. Jovan Karamata. Sur une inégalité relative aux fonctions convexes. Publications de l’Institut mathematique, 1(1):145–147, 1932.
  13. Data streaming algorithms for estimating entropy of network traffic. In Proceedings of the Joint International Conference on Measurement and Modeling of Computer Systems, SIGMETRICS ’06/Performance ’06, pages 145–156, New York, NY, USA, 2006. Association for Computing Machinery.
  14. Properties of various entropies of Gaussian distribution and comparison of entropies of fractional processes. Axioms, 12(11), 2023.
  15. Inequalities: theory of majorization and its applications. Springer Series in Statistics. Springer, New York, second edition, 2011.
  16. Liam Paninski. Estimation of Entropy and Mutual Information. Neural Computation, 15(6):1191–1253, 06 2003.
  17. Alfréd Rényi. On measures of entropy and information. In Proc. 4th Berkeley Sympos. Math. Statist. and Prob., Vol. I: Contributions to the theory of statistics, pages 547–561. Univ. California Press, Berkeley-Los Angeles, Calif., 1960. Held at the Statistical Laboratory, University of California, June 20–July 30, 1960.
  18. Claude Elwood Shannon. A mathematical theory of communication. Bell System Tech. J., 27:379–423, 623–656, 1948.
  19. Keyword extraction using rényi entropy: a statistical and domain independent method. In 2021 7th International Conference on Advanced Computing and Communication Systems (ICACCS), volume 1, pages 1970–1975, 2021.
  20. Minimax rates of entropy estimation on large alphabets via best polynomial approximation. IEEE Trans. Inform. Theory, 62(6):3702–3720, 2016.