Mathematics of Sparsity and Entropy: Axioms, Core Functions and Sparse Recovery (1501.05126v3)

Published 21 Jan 2015 in cs.IT and math.IT

Abstract: Sparsity and entropy are pillar notions of modern signal processing and information theory. However, there is no clear consensus among scientists on how to characterize these notions; previous efforts have addressed sparsity or entropy individually, each driven by specific research interests. This paper proposes a mathematical formalism, a joint axiomatic characterization, that helps to comprehend (the beauty of) sparsity and entropy. The paper gathers and introduces inherent, first-principles criteria as axioms and attributes that jointly characterize sparsity and entropy. The proposed set of axioms is constructive and allows one to derive simple \emph{core functions} and further generalizations. Core sparsity generalizes the Hoyer measure, the Gini index and the $pq$-means; core entropy generalizes the R\'{e}nyi entropy and the Tsallis entropy, both of which generalize Shannon entropy. Finally, core functions are successfully applied to compressed sensing and to the problem of minimum entropy given sample moments. More importantly, the (simplest) core sparsity adds theoretical support to the $\ell_1$-minimization approach in compressed sensing.
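
The paper's core functions require its full axiomatic construction, but the classical measures named in the abstract are standard. The sketch below is a minimal NumPy illustration (the helper names and toy vectors are hypothetical, not from the paper): it computes the Hoyer measure and the Gini index as sparsity measures, plus the Rényi and Tsallis entropies, both of which reduce to Shannon entropy as their parameter tends to 1.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum p_i log p_i (natural log), ignoring zero masses."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def hoyer_sparsity(x):
    """Hoyer measure: (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1).
    Equals 0 for a flat vector and 1 for a 1-sparse vector."""
    x = np.abs(np.asarray(x, dtype=float))
    n = x.size
    return (np.sqrt(n) - x.sum() / np.linalg.norm(x)) / (np.sqrt(n) - 1)

def gini_sparsity(x):
    """Gini index of |x| (Hurley-Rickard form); larger means sparser."""
    x = np.sort(np.abs(np.asarray(x, dtype=float)))  # ascending order
    n = x.size
    k = np.arange(1, n + 1)
    return 1.0 - 2.0 * np.sum((x / x.sum()) * (n - k + 0.5) / n)

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)  # Shannon is the alpha -> 1 limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        return shannon_entropy(p)  # Shannon is the q -> 1 limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

if __name__ == "__main__":
    sparse = np.array([5.0, 0.1, 0.1, 0.1])  # nearly 1-sparse vector
    flat = np.ones(4)                        # maximally spread-out vector
    print(hoyer_sparsity(sparse), hoyer_sparsity(flat))  # ~0.94 vs 0.0
    print(gini_sparsity(sparse), gini_sparsity(flat))    # ~0.69 vs 0.0

    p = np.array([0.7, 0.1, 0.1, 0.1])       # a probability distribution
    print(renyi_entropy(p, 2.0), tsallis_entropy(p, 2.0), shannon_entropy(p))
```

Both sparsity measures are scale-invariant and rank a peaked vector above a flat one, which matches the intuition the paper's axioms are meant to formalize jointly for sparsity and entropy.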

Citations (21)