
New Techniques for Upper-Bounding the ML Decoding Performance of Binary Linear Codes (1104.1471v2)

Published 8 Apr 2011 in cs.IT and math.IT

Abstract: In this paper, new techniques are presented to either simplify or improve most existing upper bounds on the maximum-likelihood (ML) decoding performance of binary linear codes over additive white Gaussian noise (AWGN) channels. First, the recently proposed union bound using truncated weight spectra by Ma et al. is re-derived in detail based on Gallager's first bounding technique (GFBT), where the "good region" is specified by a sub-optimal list decoding algorithm. The error probability caused by the bad region can be upper-bounded by the tail probability of a binomial distribution, while the error probability caused by the good region can be upper-bounded by most existing techniques. Second, we propose two techniques to tighten the union bound on the error probability caused by the good region. The first technique is based on pair-wise error probabilities, which can be further tightened by exploiting the independence between the error events and certain components of the received random vectors. The second technique is based on triplet-wise error probabilities, which can be upper-bounded by proving that any three bipolar vectors form a non-obtuse triangle. The proposed bounds improve on the conventional union bounds but have similar complexity, since they involve only the $Q$-function. The proposed bounds can also be adapted to bit-error probabilities.
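For context, the standard building blocks the abstract refers to can be sketched as follows. These are the textbook forms, not the paper's refined bounds, and the notation ($\mathcal{G}$, $A_d$, $R$, $E_b/N_0$) is the conventional one, assumed here rather than taken from the paper.

Gallager's first bounding technique (GFBT) splits the error event $E$ according to a "good region" $\mathcal{G}$ of received vectors $\underline{y}$:

$$\Pr\{E\} \le \Pr\{E,\ \underline{y} \in \mathcal{G}\} + \Pr\{\underline{y} \notin \mathcal{G}\}.$$

The conventional union bound for BPSK transmission over the AWGN channel, which the proposed techniques refine, bounds the word-error probability by a sum of pair-wise error probabilities expressed through the $Q$-function:

$$\Pr\{E\} \le \sum_{d \ge 1} A_d\, Q\!\left(\sqrt{2dR\,E_b/N_0}\right), \qquad Q(x) = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-t^2/2}\, dt,$$

where $A_d$ is the number of codewords of Hamming weight $d$ and $R$ is the code rate.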

Citations (34)
