Fundamental Limitation of Semantic Communications: Neural Estimation for Rate-Distortion (2401.01176v1)
Abstract: This paper studies the fundamental limit of semantic communications over the discrete memoryless channel. We consider the scenario of transmitting a semantic source consisting of an observation state and its corresponding semantic state, both of which are to be recovered at the receiver. To characterize this performance limit, we adopt the semantic rate-distortion function (SRDF) to study the relationship among the minimum compression rate, the observation distortion, the semantic distortion, and the channel capacity. For the case where the semantic source distribution is unknown and only a set of source samples is available, we propose a neural-network-based method that leverages generative networks to learn the semantic source distribution. Furthermore, for the special case where the semantic state is a deterministic function of the observation, we design a cascade neural network to estimate the SRDF. For the case where the semantic source distribution is perfectly known, we propose a general Blahut-Arimoto algorithm to compute the SRDF efficiently. Finally, experimental results validate the proposed algorithms on an ideal Gaussian semantic source and on several practical datasets.
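For orientation, one common formulation of the SRDF from the indirect rate-distortion literature is sketched below. The notation (observation X, hidden semantic state S, reconstructions X̂ and Ŝ, distortion measures d_o and d_s) is assumed here and may differ in detail from the paper's own definition.

```latex
% One common formulation of the semantic rate-distortion function (assumed notation):
%   X                : observation available to the encoder
%   S                : hidden semantic state, jointly distributed with X as p_{SX}
%   \hat{X}, \hat{S} : reconstructions produced at the receiver
%   d_o, d_s         : observation and semantic distortion measures
R(D_o, D_s) \;=\;
  \min_{\substack{p_{\hat{X}\hat{S}\mid X}\,:\;
                  \mathbb{E}[d_o(X,\hat{X})] \le D_o,\;
                  \mathbb{E}[d_s(S,\hat{S})] \le D_s}}
  I\bigl(X;\, \hat{X}, \hat{S}\bigr).
```

Under a standard source-channel separation argument, a distortion pair (D_o, D_s) is then achievable over a discrete memoryless channel of capacity C when R(D_o, D_s) < C and unachievable when R(D_o, D_s) > C, which is how the SRDF ties the minimum compression rate to the channel capacity in the abstract.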
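The known-distribution case in the abstract relies on a Blahut-Arimoto-type iteration. As background, here is a minimal sketch of the classical Blahut-Arimoto algorithm for the ordinary rate-distortion function with a single distortion constraint; the paper's general variant for the SRDF (two distortion constraints and a hidden semantic state) is not reproduced here, and the function name and parameters are illustrative.

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=500, tol=1e-10):
    """Classical Blahut-Arimoto iteration for a single-constraint
    rate-distortion function R(D).

    p_x  : (n,) source distribution over the alphabet of X
    dist : (n, m) distortion matrix d(x, x_hat)
    beta : Lagrange multiplier (> 0) trading rate against distortion
    Returns (rate in bits, expected distortion) at the point traced out by beta.
    """
    n, m = dist.shape
    q_xhat = np.full(m, 1.0 / m)                 # reproduction marginal q(x_hat)
    for _ in range(n_iter):
        # Optimal test channel for the current marginal:
        #   q(x_hat | x) ∝ q(x_hat) * exp(-beta * d(x, x_hat))
        log_w = np.log(q_xhat)[None, :] - beta * dist
        log_w -= log_w.max(axis=1, keepdims=True)    # numerical stability
        w = np.exp(log_w)
        q_cond = w / w.sum(axis=1, keepdims=True)    # shape (n, m)
        q_new = p_x @ q_cond                         # updated reproduction marginal
        if np.max(np.abs(q_new - q_xhat)) < tol:
            q_xhat = q_new
            break
        q_xhat = q_new
    joint = p_x[:, None] * q_cond
    distortion = float(np.sum(joint * dist))
    rate = float(np.sum(joint * (np.log2(q_cond) - np.log2(q_xhat)[None, :])))
    return rate, distortion

# Example: binary uniform source with Hamming distortion, where R(D) = 1 - h(D).
p_x = np.array([0.5, 0.5])
dist = 1.0 - np.eye(2)
rate, D = blahut_arimoto(p_x, dist, beta=3.0)
print(f"R ≈ {rate:.3f} bits at D ≈ {D:.3f}")
```

Sweeping beta traces out the rate-distortion curve. A generalized version for the SRDF would, presumably, replace the single exponent with a weighted combination of the observation and semantic distortions, with the semantic distortion averaged over the posterior p(s|x); the exact update rules used in the paper are not reproduced here.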