
Bayesian Quantized Network Coding via Belief Propagation (1209.1679v3)

Published 8 Sep 2012 in cs.IT and math.IT

Abstract: In this paper, we propose an alternative to routing-based packet forwarding that uses network coding to increase transmission efficiency, in terms of both compression and error resilience. This non-adaptive encoding, called quantized network coding, applies a random linear mapping in the real field, followed by quantization to cope with the finite capacity of the links. At the gateway node, which collects the received quantized-network-coded packets, minimum mean squared error decoding is performed using belief propagation on the factor graph representation. Our simulation results show a significant improvement in the number of packets required to recover the messages, which can be interpreted as an embedded distributed source coding of correlated messages.
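The encoding step described above — a random real-valued linear map over the source messages followed by scalar quantization — can be sketched in a few lines. This is an illustrative toy (dimensions, step size, and the Gaussian coding matrix are assumptions, not the paper's parameters), and it substitutes a simple least-squares estimate for the paper's MMSE belief-propagation decoder to show that the messages remain recoverable from quantized packets:

```python
import numpy as np

# Hypothetical toy setup: n correlated source messages, m coded packets.
rng = np.random.default_rng(0)
n = 20          # number of source messages
m = 30          # number of quantized coded packets at the gateway
step = 0.1      # uniform quantizer step size (stands in for finite link capacity)

x = rng.normal(size=n)                      # real-valued source messages
A = rng.normal(size=(m, n)) / np.sqrt(n)    # random linear network-coding map

def quantize(v, step):
    """Uniform scalar quantizer modelling the links' finite capacity."""
    return step * np.round(v / step)

y = quantize(A @ x, step)                   # packets collected at the gateway

# The paper uses MMSE decoding via belief propagation on a factor graph;
# here a least-squares solve serves as a simple stand-in decoder.
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(float(np.max(np.abs(x_hat - x))))     # residual from quantization noise
```

Because the quantization error per packet is bounded by half the step size, the reconstruction error stays small; the belief-propagation decoder in the paper additionally exploits the correlation between messages, which this sketch ignores.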

Citations (3)
