On the Capacity of a Class of Signal-Dependent Noise Channels (1702.03590v2)
Abstract: In some applications, the variance of additive measurement noise depends on the signal that we aim to measure. For instance, additive Gaussian signal-dependent noise (AGSDN) channel models are used in molecular and optical communication. Herein we provide lower and upper bounds on the capacity of additive signal-dependent noise (ASDN) channels. The first lower bound is based on an extension of a majorization inequality, and the second relies on calculations that exploit the fact that $h(Y) > h(Y|Z)$. Both lower bounds are valid for all ASDN channels defined in the paper. The upper bound builds on a previous idea of the authors ("symmetric relative entropy") and applies to AGSDN channels. These bounds indicate that in ASDN channels (unlike classical AWGN channels), making the noise variance function smaller does not necessarily increase the capacity. We also provide sufficient conditions under which the capacity becomes infinite. This is complemented by a number of conditions under which the capacity is finite and a unique capacity-achieving measure exists (in the sense of the output measure).
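For orientation, a minimal sketch of the channel model in the standard form used for signal-dependent noise channels (the notation here is assumed, not quoted from the paper): the output is the input plus noise whose standard deviation is a function of the input,
\[
  Y = X + \sigma(X)\, Z,
\]
where $Z$ is a noise random variable independent of $X$ and $\sigma(\cdot) \geq 0$ is the signal-dependent standard-deviation function; the AGSDN case corresponds to $Z \sim \mathcal{N}(0,1)$. Under such a model the mutual information decomposes as $I(X;Y) = h(Y) - h(Y|X)$, and since conditioning cannot increase differential entropy, $h(Y) \geq h(Y|Z)$, which is the kind of relation the second lower bound exploits.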