Asymptotic Expansions for Gaussian Channels with Feedback under a Peak Power Constraint (1410.2390v2)

Published 9 Oct 2014 in cs.IT and math.IT

Abstract: This paper investigates the asymptotic expansion for the size of block codes defined for the additive white Gaussian noise (AWGN) channel with feedback under the following setting: A peak power constraint is imposed on every transmitted codeword, and the average error probability of decoding the transmitted message is non-vanishing as the blocklength increases. It is well-known that the presence of feedback does not increase the first-order asymptotics (i.e., capacity) in the asymptotic expansion for the AWGN channel. The main contribution of this paper is a self-contained proof of an upper bound on the asymptotic expansion for the AWGN channel with feedback. Combined with existing achievability results for the AWGN channel, our result implies that the presence of feedback does not improve the second- and third-order asymptotics. An auxiliary contribution is a proof of the strong converse for the parallel Gaussian channels with feedback under a peak power constraint.
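For readers unfamiliar with the terminology, the "second-" and "third-order asymptotics" refer to the square-root and logarithmic terms in the normal-approximation expansion of the maximum code size M*(n, ε) at blocklength n and error probability ε. A sketch of the standard expansion for the AWGN channel without feedback (the form established by Polyanskiy, Poor, and Verdú, with the third-order term refined in later work) is given below; the notation assumes P is the permitted peak power (signal-to-noise ratio) and Q^{-1} is the inverse Gaussian tail function. The paper's contribution is a converse showing that these second- and third-order terms are not improved when feedback is available.

\log M^*(n, \varepsilon) = \frac{n}{2}\log(1+P) - \sqrt{n\,V(P)}\,Q^{-1}(\varepsilon) + \frac{1}{2}\log n + O(1),
\qquad
V(P) = \frac{P(P+2)}{2(1+P)^2}\,\log^2 e,

where V(P) is the Gaussian dispersion (the log base is the same as that used for the capacity term). The first-order term (n/2) log(1+P) is the capacity; the dispersion term of order \sqrt{n} is the second-order asymptotics; the (1/2) log n term is the third-order asymptotics.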

Citations (4)
