Uncertainty relations in terms of generalized entropies derived from information diagrams (2305.18005v1)
Abstract: Entropic uncertainty relations are interesting in their own right as well as for many applications. Keeping this in mind, we try to make the corresponding inequalities as tight as possible. The use of parametrized entropies also allows one to improve relations between various information measures. Measurements of special types are widely used in quantum information science. For many of them we can estimate the index of coincidence, defined as the total sum of squared probabilities. Inequalities between entropies and the index of coincidence form a long-standing direction of research in classical information theory. The so-called information diagrams provide a powerful tool for obtaining inequalities of interest. In the literature, results of this kind mainly deal with standard information functions linked to the Shannon entropy. At the same time, generalized information functions have found use in questions of quantum information theory. In effect, R\'{e}nyi and Tsallis entropies and related functions are of separate interest. This paper is devoted to entropic uncertainty relations derived from information diagrams. The obtained inequalities are then applied to mutually unbiased bases, symmetric informationally complete measurements and their generalizations. We also improve entropic uncertainty relations for quantum measurements assigned to an equiangular tight frame.
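To make the quantities in the abstract concrete, here is a minimal Python sketch (not from the paper) computing the index of coincidence together with the Rényi and Tsallis entropies of a probability vector; natural logarithms and the standard textbook definitions are assumed.

```python
import numpy as np

def index_of_coincidence(p):
    """Index of coincidence: the total sum of squared probabilities."""
    p = np.asarray(p, dtype=float)
    return np.sum(p ** 2)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), natural log."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q != 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Example: outcome distribution of some measurement
p = [0.5, 0.25, 0.25]
print(index_of_coincidence(p))   # 0.375
print(renyi_entropy(p, 2))       # collision entropy = -ln(IC) ≈ 0.9808
print(tsallis_entropy(p, 2))     # 1 - IC = 0.625
```

For order 2 the Rényi entropy reduces to the negative logarithm of the index of coincidence, and the Tsallis entropy to one minus it, which is why bounds on the index of coincidence translate directly into entropic uncertainty relations of the kind studied in the paper.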