The Feedback Capacity of the $(1,\infty)$-RLL Input-Constrained Erasure Channel (1503.03359v1)

Published 11 Mar 2015 in cs.IT and math.IT

Abstract: The input-constrained erasure channel with feedback is considered, where the binary input sequence contains no consecutive ones, i.e., it satisfies the $(1,\infty)$-RLL constraint. We derive the capacity for this setting, which can be expressed as $C_{\epsilon}=\max_{0 \leq p \leq \frac{1}{2}}\frac{H_{b}(p)}{p+\frac{1}{1-\epsilon}}$, where $\epsilon$ is the erasure probability and $H_{b}(\cdot)$ is the binary entropy function. Moreover, we prove that a priori knowledge of the erasure at the encoder does not increase the feedback capacity. The feedback capacity is calculated using an equivalent dynamic programming (DP) formulation whose optimal average reward equals the capacity. Furthermore, we obtain an optimal encoding procedure from the solution of the DP, leading to a capacity-achieving, zero-error coding scheme for our setting. DP is thus shown to be a tool not only for solving optimization problems such as capacity calculation, but also for constructing optimal coding schemes. The derived capacity expression also serves as the only non-trivial upper bound known on the capacity of the input-constrained erasure channel without feedback, a problem that remains open.
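The closed-form expression $C_{\epsilon}=\max_{0 \leq p \leq \frac{1}{2}}\frac{H_{b}(p)}{p+\frac{1}{1-\epsilon}}$ is easy to evaluate numerically. A minimal sketch (the function names are illustrative, not taken from the paper) uses a grid search over $p$; as a sanity check, setting $\epsilon = 0$ should recover $\log_2 \varphi \approx 0.6942$, the well-known noiseless capacity of the $(1,\infty)$-RLL constraint, where $\varphi$ is the golden ratio:

```python
import math

def h_b(p):
    """Binary entropy function H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def feedback_capacity(eps, grid=100000):
    """Evaluate C_eps = max_{0 <= p <= 1/2} H_b(p) / (p + 1/(1-eps))
    by grid search over p; eps is the erasure probability in [0, 1)."""
    best = 0.0
    for i in range(1, grid):
        p = 0.5 * i / grid
        best = max(best, h_b(p) / (p + 1.0 / (1.0 - eps)))
    return best
```

For example, `feedback_capacity(0.0)` agrees with `math.log2((1 + math.sqrt(5)) / 2)` to several decimal places, and the capacity decreases as the erasure probability grows, as expected.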
