Lossy Source Compression of Non-Uniform Binary Sources Using GQ-LDGM Codes
Abstract: In this paper, we study the use of GF(q)-quantized LDGM codes for binary source coding. By employing quantization, it is possible to obtain binary codewords with a non-uniform distribution. The resulting codeword statistics are hence suitable for optimal, direct quantization of non-uniform Bernoulli sources. We employ a message-passing algorithm combined with a decimation procedure in order to perform compression. Experimental results based on GF(q)-LDGM codes with regular degree distributions yield performance quite close to the theoretical rate-distortion bounds.
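The theoretical reference mentioned in the abstract is the standard Shannon rate-distortion bound for a Bernoulli(p) source under Hamming distortion, R(D) = h(p) - h(D) for 0 <= D <= min(p, 1-p), where h(.) is the binary entropy function. The following Python sketch (not taken from the paper; function names and the usage example are illustrative) evaluates this bound and numerically inverts it to find the distortion achievable at a given rate.

```python
import math

def binary_entropy(x: float) -> float:
    """Binary entropy h(x) in bits; h(0) = h(1) = 0 by convention."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

def rate_distortion_bernoulli(p: float, d: float) -> float:
    """Shannon rate-distortion function R(D) = h(p) - h(D), in bits/symbol,
    for a Bernoulli(p) source under Hamming distortion (0 <= D <= min(p, 1-p))."""
    p = min(p, 1.0 - p)
    if d >= p:
        return 0.0
    return binary_entropy(p) - binary_entropy(d)

if __name__ == "__main__":
    # Hypothetical example: a non-uniform Bernoulli(0.2) source compressed at
    # rate 0.5 bit/symbol; bisection finds the distortion-rate value D(R).
    p, rate = 0.2, 0.5
    lo, hi = 0.0, min(p, 1.0 - p)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if rate_distortion_bernoulli(p, mid) > rate:
            lo = mid  # R(mid) still above the target rate: allow more distortion
        else:
            hi = mid
    print(f"Distortion-rate bound: D(R={rate}) ~= {hi:.4f} for Bernoulli(p={p})")
```

Such a routine gives the baseline against which the compression performance of the GF(q)-LDGM quantizer with message passing and decimation would be compared.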