Polar Codes: Speed of polarization and polynomial gap to capacity (1304.4321v2)
Abstract: We prove that, for all binary-input symmetric memoryless channels, polar codes enable reliable communication at rates within $\epsilon > 0$ of the Shannon capacity with a block length, construction complexity, and decoding complexity all bounded by a {\em polynomial} in $1/\epsilon$. Polar coding gives the {\em first known explicit construction} with rigorous proofs of all these properties; previous constructions were not known to achieve capacity with less than $\exp(1/\epsilon)$ decoding complexity except for erasure channels. We establish the capacity-achieving property of polar codes via a direct analysis of the underlying martingale of conditional entropies, without relying on the martingale convergence theorem. This step gives rough polarization (noise levels $\approx \epsilon$ for the "good" channels), which can then be adequately amplified by tracking the decay of the channel Bhattacharyya parameters. Our effective bounds imply that polar codes can have block length (and encoding/decoding complexity) bounded by a polynomial in $1/\epsilon$. The generator matrix of such polar codes can be constructed in polynomial time by algorithmically computing an adequate approximation of the polarization process.
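To make the "tracking the decay of the channel Bhattacharyya parameters" step concrete, here is a minimal illustrative sketch (not taken from the paper) for the special case of the binary erasure channel, where the Bhattacharyya parameter equals the erasure probability and the one-step polar transform acts exactly as $Z \mapsto 2Z - Z^2$ (degraded branch) and $Z \mapsto Z^2$ (upgraded branch). The function name `polarize_bec` and the reliability threshold below are illustrative choices, not quantities defined in the paper.

```python
# Illustrative sketch: exact Bhattacharyya-parameter recursion for BEC(eps).
# For the erasure channel the polar transform maps a parameter Z to
#   2Z - Z^2  (the "minus"/degraded synthetic channel) and
#   Z^2       (the "plus"/upgraded synthetic channel),
# so iterating the map tracks the polarization process exactly.

def polarize_bec(eps: float, levels: int) -> list[float]:
    """Return the 2**levels Bhattacharyya parameters after `levels` polar steps."""
    zs = [eps]
    for _ in range(levels):
        nxt = []
        for z in zs:
            nxt.append(2 * z - z * z)  # degraded branch
            nxt.append(z * z)          # upgraded branch
        zs = nxt
    return zs

if __name__ == "__main__":
    # BEC with erasure probability 0.5 and 10 polarization levels (block length 1024).
    zs = polarize_bec(0.5, 10)
    threshold = 1e-6  # hypothetical cutoff for declaring a synthetic channel "good"
    good = sum(z < threshold for z in zs)
    print(f"{good} of {len(zs)} synthetic channels have Z < {threshold}")
    # As the number of levels grows, the fraction of good channels approaches
    # the BEC capacity 1 - eps = 0.5, illustrating polarization.
```

For general binary-input symmetric channels the recursion above only bounds the parameters ($Z(W^-) \le 2Z - Z^2$, $Z(W^+) = Z^2$), and the paper's polynomial-time construction instead computes an adequate algorithmic approximation of the polarization process.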