
On Sampling Continuous-Time AWGN Channels (2002.02716v2)

Published 7 Feb 2020 in cs.IT and math.IT

Abstract: For a continuous-time additive white Gaussian noise (AWGN) channel with possible feedback, it has been shown that, as sampling gets infinitesimally fine, the mutual information of the associated discrete-time channels converges to that of the original continuous-time channel. In this paper we give more quantitative strengthenings of this result, which, among other implications, characterize how the mutual information of the over-sampled discrete-time channels approaches the true mutual information of a band-limited continuous-time Gaussian channel. The assumptions in our results are relatively mild. In particular, for the non-feedback case, our results require only some integrability conditions on the power spectral density function of the input, whereas the Shannon-Nyquist sampling theorem, a widely used tool for connecting continuous-time Gaussian channels to their discrete-time counterparts, requires band-limitedness of the channel input.
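
For orientation, the convergence that the paper quantitatively strengthens can be sketched as follows. This uses the standard integral formulation of the continuous-time AWGN channel from the sampling literature; the notation is not taken verbatim from the paper.

Continuous-time channel over [0, T], with channel input X and standard Brownian motion B:

    Y(t) = \int_0^t X(s)\, ds + B(t), \qquad 0 \le t \le T.

Sampling at times t_i = iT/n yields a discrete-time Gaussian channel built from the increments

    Y(t_i) - Y(t_{i-1}) = \int_{t_{i-1}}^{t_i} X(s)\, ds + \big( B(t_i) - B(t_{i-1}) \big), \qquad i = 1, \dots, n,

and the qualitative result being refined states that

    \lim_{n \to \infty} I\big( X_0^T;\, Y(t_1), \dots, Y(t_n) \big) = I\big( X_0^T;\, Y_0^T \big).

The paper's contribution is to characterize the rate at which this limit is approached as the sampling becomes finer.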
