A Mathematical Theory of Semantic Communication (2401.13387v2)

Published 24 Jan 2024 in cs.IT and math.IT

Abstract: The year 1948 witnessed the historic moment of the birth of classic information theory (CIT). Guided by CIT, modern communication techniques have approached the theoretical limits, such as the entropy function $H(U)$, the channel capacity $C=\max_{p(x)}I(X;Y)$, and the rate-distortion function $R(D)=\min_{p(\hat{x}|x):\mathbb{E}d(x,\hat{x})\leq D} I(X;\hat{X})$. Semantic communication opens a new direction for future communication techniques, but the guiding theory has been missing. In this paper, we try to establish a systematic framework of semantic information theory (SIT). We investigate the behavior of semantic communication and find that synonymy is its basic feature, so we define the synonymous mapping between semantic information and syntactic information. Stemming from this core concept, the synonymous mapping $f$, we introduce measures of semantic information, such as the semantic entropy $H_s(\tilde{U})$, the up/down semantic mutual information $I^s(\tilde{X};\tilde{Y})$ $\left(I_s(\tilde{X};\tilde{Y})\right)$, the semantic capacity $C_s=\max_{f_{xy}}\max_{p(x)}I^s(\tilde{X};\tilde{Y})$, and the semantic rate-distortion function $R_s(D)=\min_{\{f_x,f_{\hat{x}}\}}\min_{p(\hat{x}|x):\mathbb{E}d_s(\tilde{x},\hat{\tilde{x}})\leq D}I_s(\tilde{X};\hat{\tilde{X}})$. Furthermore, we prove three coding theorems of SIT by using random coding and (jointly) typical decoding/encoding, namely the semantic source coding theorem, the semantic channel coding theorem, and the semantic rate-distortion coding theorem. We find that synonymous mapping extends the limits of CIT, that is, $H_s(\tilde{U})\leq H(U)$, $C_s\geq C$, and $R_s(D)\leq R(D)$. Together, these results constitute the basis of semantic information theory. In addition, we discuss the semantic information measures in the continuous case. For the band-limited Gaussian channel, we obtain a new channel capacity formula, $C_s=B\log\left[S^4\left(1+\frac{P}{N_0B}\right)\right]$.
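The abstract's central inequalities can be illustrated numerically. The sketch below is not from the paper: the toy alphabet, probabilities, synonym classes, and channel parameters are illustrative assumptions. It computes semantic entropy by summing probabilities within each synonym class induced by a mapping $f$ (so $H_s(\tilde{U})\leq H(U)$), and compares the paper's Gaussian semantic capacity formula against the classic Shannon capacity (so $C_s\geq C$ whenever the synonymous length $S\geq 1$).

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Syntactic source U: four symbols with assumed probabilities.
p_u = {"car": 0.4, "automobile": 0.1, "cat": 0.3, "feline": 0.2}

# Assumed synonymous mapping f: syntactic symbol -> semantic synonym class.
f = {"car": "VEHICLE", "automobile": "VEHICLE", "cat": "CAT", "feline": "CAT"}

# Semantic symbol probabilities: sum probabilities within each synonym set.
p_semantic = {}
for u, p in p_u.items():
    p_semantic[f[u]] = p_semantic.get(f[u], 0.0) + p

H = entropy(p_u.values())           # classic entropy H(U)
H_s = entropy(p_semantic.values())  # semantic entropy H_s(U~)
assert H_s <= H  # merging synonymous symbols can only reduce entropy

# Band-limited Gaussian channel (illustrative values): the paper's
# C_s = B*log2[S^4*(1 + P/(N0*B))] vs. Shannon's C = B*log2(1 + P/(N0*B)).
B, P, N0, S = 1e6, 1.0, 1e-8, 2
C = B * math.log2(1 + P / (N0 * B))
C_s = B * math.log2(S**4 * (1 + P / (N0 * B)))
assert C_s >= C  # holds whenever the synonymous length S >= 1

print(f"H(U) = {H:.4f} bits, H_s = {H_s:.4f} bits")
print(f"C = {C:.3e} bit/s, C_s = {C_s:.3e} bit/s")
```

Merging synonyms coarsens the distribution, which is why the semantic limits are looser than the classic ones: fewer distinguishable semantic symbols to encode, and an extra $4B\log_2 S$ bit/s of capacity for the Gaussian channel.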
