Belief-Propagation Decoding of Lattices Using Gaussian Mixtures (0904.4741v1)
Abstract: A belief-propagation decoder for low-density lattice codes is given which represents messages explicitly as a mixture of Gaussian functions. The key component is an algorithm for approximating a mixture of several Gaussians with another mixture with a smaller number of Gaussians. This Gaussian mixture reduction algorithm iteratively reduces the number of Gaussians by minimizing the distance between the original mixture and an approximation with one fewer Gaussian. Error rates and noise thresholds of this decoder are compared with those for the previously-proposed decoder which discretely quantizes the messages. The error rates are indistinguishable for dimension 1000 and 10000 lattices, and the Gaussian-mixture decoder has a 0.2 dB loss for dimension 100 lattices. The Gaussian-mixture decoder has a loss of about 0.03 dB in the noise threshold, which is evaluated via Monte Carlo density evolution. Further, the Gaussian-mixture decoder uses far less storage for the messages.
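To illustrate the kind of iterative reduction the abstract describes, the sketch below greedily merges one pair of 1-D Gaussian components at a time until the target mixture size is reached. The paper's actual distance measure and merge rule are not given in the abstract; this example assumes moment-preserving pairwise merges scored by a simple Runnalls-style cost, purely as an illustrative stand-in.

```python
# Minimal sketch of greedy Gaussian-mixture reduction (1-D components).
# Assumptions (not from the paper): moment-preserving pair merges and a
# Runnalls-style merge cost are used to pick which pair to combine.
import math
from itertools import combinations

def merge_pair(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussians (weight, mean, variance)."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def merge_cost(w1, m1, v1, w2, m2, v2):
    """Cost of merging two components (upper bound on the resulting KL increase)."""
    w, _, v = merge_pair(w1, m1, v1, w2, m2, v2)
    return 0.5 * (w * math.log(v) - w1 * math.log(v1) - w2 * math.log(v2))

def reduce_mixture(components, max_components):
    """components: list of (weight, mean, variance); merge pairs until small enough."""
    comps = list(components)
    while len(comps) > max_components:
        i, j = min(combinations(range(len(comps)), 2),
                   key=lambda ij: merge_cost(*comps[ij[0]], *comps[ij[1]]))
        merged = merge_pair(*comps[i], *comps[j])
        comps = [c for k, c in enumerate(comps) if k not in (i, j)] + [merged]
    return comps

# Example: reduce a 4-component mixture to 2 components.
mix = [(0.3, -1.0, 0.2), (0.2, -0.9, 0.25), (0.3, 1.1, 0.3), (0.2, 1.0, 0.2)]
print(reduce_mixture(mix, 2))
```

In a decoder, such a reduction step would bound the number of Gaussians carried by each message, which is why the abstract reports far lower message storage than the quantized-message decoder.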