
Easily Computed Lower Bounds on the Information Rate of Intersymbol Interference Channels (1110.0560v1)

Published 4 Oct 2011 in cs.IT and math.IT

Abstract: Provable lower bounds are presented for the information rate I(X; X+S+N), where X is a symbol drawn independently and uniformly from a finite-size alphabet, S is a discrete-valued random variable (RV), and N is a Gaussian RV. It is well known that with S representing the precursor intersymbol interference (ISI) at the decision feedback equalizer (DFE) output, I(X; X+S+N) serves as a tight lower bound on the symmetric information rate (SIR) as well as the capacity of the ISI channel corrupted by Gaussian noise. When evaluated on a number of well-known finite-ISI channels, the new bounds match the tightness of the conjectured lower bound of Shamai and Laroia against the SIR over all signal-to-noise ratio (SNR) ranges, and are in fact tighter when examined closely at high SNRs. The new lower bounds are obtained in two steps: first, a "mismatched" mutual information function is introduced that can be proved to be a lower bound on I(X; X+S+N); second, this function is further bounded from below by an expression that can be computed easily via a few one-dimensional integrations with small computational load.
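For intuition about the quantity being bounded, the sketch below estimates I(X; X+S+N) by brute-force Monte Carlo for a toy setting with a BPSK input X, a discrete residual-ISI term S, and Gaussian noise N. This is not the paper's easily computed lower bound (which relies on the mismatched mutual information function and a few one-dimensional integrations); the residual taps, noise level, and sample size here are arbitrary assumptions chosen only for illustration.

```python
# Illustrative Monte Carlo estimate of I(X; X+S+N) for a toy setting:
# X uniform BPSK, S a discrete residual-ISI term, N Gaussian.
# Assumed values (taps, sigma, n_samples) are for demonstration only.
import itertools
import numpy as np

rng = np.random.default_rng(0)

alphabet = np.array([-1.0, 1.0])     # uniform BPSK input X
taps = np.array([0.3, -0.2])         # assumed residual-ISI taps shaping S
sigma = 0.5                          # assumed Gaussian noise std for N
n_samples = 200_000

# S takes one value per pattern of interfering symbols through the residual taps.
patterns = np.array(list(itertools.product(alphabet, repeat=len(taps))))
s_values = patterns @ taps           # equiprobable discrete values of S

def gauss(z):
    return np.exp(-z**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# Draw samples of X, S, N and form the channel output Y = X + S + N.
x = rng.choice(alphabet, size=n_samples)
s = rng.choice(s_values, size=n_samples)
y = x + s + rng.normal(0.0, sigma, size=n_samples)

# p(y | x) = E_S[ phi(y - x - S) ], averaging over the discrete ISI values.
def cond_density(y, x):
    return gauss(y[:, None] - x[:, None] - s_values[None, :]).mean(axis=1)

p_y_given_x = cond_density(y, x)
# p(y) = average of p(y | x') over the uniform input alphabet.
p_y = np.mean([cond_density(y, np.full(n_samples, a)) for a in alphabet], axis=0)

# I(X; Y) = E[ log2 p(Y|X) - log2 p(Y) ], in bits per symbol.
mi_estimate = np.mean(np.log2(p_y_given_x) - np.log2(p_y))
print(f"Monte Carlo estimate of I(X; X+S+N): {mi_estimate:.4f} bits/symbol")
```

Such a simulation-based estimate requires many samples to converge, which is precisely the cost the paper's provable, easily computed lower bounds avoid.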

Citations (9)
