Identification over Additive Noise Channels in the Presence of Feedback (2102.01198v4)

Published 1 Feb 2021 in cs.IT and math.IT

Abstract: We analyze deterministic message identification over channels with non-discrete additive white noise and a noiseless feedback link, under both average-power and peak-power constraints. The identification task belongs to Post-Shannon theory; considering communication systems beyond Shannon's transmission paradigm can increase the efficiency of information transfer for certain applications. We propose a coding scheme that first generates infinite common randomness between the sender and the receiver. If the channel has positive message-transmission feedback capacity, then for given error thresholds and sufficiently large blocklength this common randomness is used to construct arbitrarily large deterministic identification codes. In particular, the deterministic identification feedback capacity is infinite regardless of the scaling (exponential, doubly exponential, etc.) chosen in the capacity definition. Clearly, these results continue to hold if randomized encoding is allowed in addition to the use of feedback.
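The two-phase idea in the abstract — first establish common randomness over the feedback link, then use it to select a short identification tag — can be illustrated with a minimal sketch. This is not the paper's construction; the `tag`/`identify` functions, the hash-based coloring, and the tag-space size are illustrative assumptions chosen to show why shared randomness lets the number of identifiable messages grow far beyond what transmission codes allow.

```python
import hashlib
import random

NUM_TAGS = 2**16  # assumed tag-space size; false-positive rate ~ 1/NUM_TAGS

def tag(message: int, common_randomness: bytes) -> int:
    """Hypothetical coloring: hash the message together with the shared
    randomness to obtain a short tag (a random coloring of the message set)."""
    digest = hashlib.sha256(common_randomness + message.to_bytes(8, "big")).digest()
    return int.from_bytes(digest[:4], "big") % NUM_TAGS

def identify(sent_message: int, target_message: int, common_randomness: bytes) -> bool:
    """The sender transmits only the short tag of its message; the receiver,
    interested in one particular target message, checks for a tag match."""
    return tag(sent_message, common_randomness) == tag(target_message, common_randomness)

# Phase 1 (stand-in): pretend the feedback protocol already produced
# shared randomness on both sides.
rng = random.Random(0)
cr = rng.randbytes(32)

# Phase 2: identification. A match is guaranteed when the sent message
# equals the target; distinct messages collide only with small probability.
assert identify(42, 42, cr)
```

Because only a fixed-length tag is transmitted regardless of how many messages exist, the message set can be made arbitrarily large for a fixed error threshold, which is the intuition behind the infinite deterministic identification feedback capacity.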

Citations (15)
