Mathematical Model of Quantum Channel Capacity (2302.04873v2)

Published 9 Feb 2023 in cs.IT, eess.SP, and math.IT

Abstract: In this article, we propose a closed-form expression for the capacity of a single quantum channel. A Gaussian-distributed input is assumed for the analytical calculation of the capacity. In our previous papers we introduced models for the joint quantum noise and the corresponding received signal; in the present work we prove that these models are Gaussian mixture distributions. We show how to treat both cases, namely (I) the Gaussian mixture distribution for scalar variables and (II) the Gaussian mixture distribution for random vectors. Our goal is to compute the entropy of the joint noise and the entropy of the received signal in order to obtain the capacity expression of the quantum channel. The main challenge is the functional form of the Gaussian mixture distribution: its entropy cannot be expressed in closed form because of the logarithm of a sum of exponential functions. As a solution, we propose a lower bound and an upper bound for each of the entropies of the joint noise and the received signal; combining these inequalities yields an upper bound on the mutual information and hence on the maximum achievable data rate, i.e., the capacity. Unlike our previous works, this paper presents a closed-form capacity expression. The capacity expression and the corresponding bounds are derived for both cases: the Gaussian mixture distribution for scalar variables and the Gaussian mixture distribution for random vectors.
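
The core difficulty the abstract describes is that the differential entropy of a Gaussian mixture involves the logarithm of a sum of exponentials and therefore has no closed form, so it must be sandwiched between analytical bounds. The sketch below is a generic numerical illustration of that idea, not the paper's actual bounds or channel model: it uses a hypothetical scalar two-component mixture, a Monte Carlo entropy estimate, a simple lower bound from conditioning on the component index, and a simple upper bound from the maximum-entropy Gaussian with the mixture's overall variance.

```python
import numpy as np

# Illustrative only: a hypothetical 1-D Gaussian mixture (weights, means, std devs).
# These parameters are assumptions for the sketch, not values from the paper.
rng = np.random.default_rng(0)
w = np.array([0.4, 0.6])
mu = np.array([-1.0, 2.0])
sigma = np.array([0.5, 1.5])

def mixture_pdf(x):
    """Evaluate the Gaussian-mixture density at the points in x."""
    comps = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return comps.sum(axis=1)

# Monte Carlo estimate of the differential entropy h(X) = -E[log p(X)];
# there is no closed form because log p(x) is the log of a sum of exponentials.
n = 200_000
k = rng.choice(len(w), size=n, p=w)        # draw a component index per sample
samples = rng.normal(mu[k], sigma[k])      # draw from the chosen component
h_mc = -np.mean(np.log(mixture_pdf(samples)))

# Lower bound: conditioning reduces entropy, so
# h(X) >= h(X | K) = sum_k w_k * 0.5 * log(2*pi*e*sigma_k^2).
h_lower = np.sum(w * 0.5 * np.log(2 * np.pi * np.e * sigma ** 2))

# Upper bound: among all densities with the same variance, the Gaussian
# maximizes entropy, so h(X) <= 0.5 * log(2*pi*e*Var[X]).
mean_total = np.sum(w * mu)
var_total = np.sum(w * (sigma ** 2 + mu ** 2)) - mean_total ** 2
h_upper = 0.5 * np.log(2 * np.pi * np.e * var_total)

print(f"lower {h_lower:.4f} <= Monte Carlo {h_mc:.4f} <= upper {h_upper:.4f}")
```

Bounding each entropy term this way is what lets mutual information, and hence the achievable rate, be bracketed analytically even though the mixture entropies themselves are intractable in closed form.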
