
Information without rolling dice (1504.04654v1)

Published 17 Apr 2015 in cs.IT and math.IT

Abstract: The deterministic notions of capacity and entropy are studied in the context of communication and storage of information using square-integrable, bandlimited signals subject to perturbation. The $(\epsilon,\delta)$-capacity, which extends the Kolmogorov $\epsilon$-capacity to packing sets of overlap at most $\delta$, is introduced and compared to the Shannon capacity. The functional form of the results indicates that in both Kolmogorov's and Shannon's settings, capacity and entropy grow linearly with the number of degrees of freedom, but only logarithmically with the signal-to-noise ratio. This basic insight transcends the details of the stochastic or deterministic description of the information-theoretic model. For $\delta=0$, the analysis leads to new bounds on the Kolmogorov $\epsilon$-capacity, and to a tight asymptotic expression for the Kolmogorov $\epsilon$-entropy of bandlimited signals. A deterministic notion of error exponent is introduced. Applications of the theory are briefly discussed.
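The scaling insight highlighted in the abstract — linear growth in the number of degrees of freedom, logarithmic growth in the signal-to-noise ratio — can be illustrated with the classical Shannon capacity of a bandlimited channel, $C = W \log_2(1+\mathrm{SNR})$. This is the standard stochastic formula, not the paper's deterministic $(\epsilon,\delta)$-capacity; the function name and parameters below are illustrative:

```python
import math

def bandlimited_capacity_bits(W_hz, T_s, snr):
    """Bits conveyable by a signal of bandwidth W_hz over T_s seconds
    at the given SNR, via Shannon's C = W * log2(1 + SNR).

    A bandlimited signal of duration T has roughly 2*W*T degrees of
    freedom (Nyquist); they enter the capacity linearly, while the
    SNR enters only through a logarithm.
    """
    n_dof = 2 * W_hz * T_s  # degrees of freedom of the signal space
    return 0.5 * n_dof * math.log2(1.0 + snr)
```

Doubling the bandwidth-time product doubles the bit count, whereas multiplying the SNR by 100 adds only about $\log_2 100 \approx 6.6$ bits per two degrees of freedom — the same functional form the paper recovers in the deterministic setting.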

Citations (16)
