
A Markov Chain Model for the Decoding Probability of Sparse Network Coding (1607.05037v3)

Published 18 Jul 2016 in cs.NI

Abstract: Random Linear Network Coding (RLNC) has been proven to offer an efficient communication scheme, leveraging an interesting robustness against packet losses. However, it suffers from high computational complexity, and some novel approaches following the same idea have recently been proposed. One such solution is Tunable Sparse Network Coding (TSNC), where only a few packets are combined in each transmission. The number of data packets combined in each transmission is set by a density parameter/distribution, which could eventually be adapted. In this work we present an analytical model that accurately captures the performance of Sparse Network Coding (SNC). We exploit an absorbing Markov process whose states are defined by the number of useful packets received by the decoder, i.e. the rank of the decoding matrix, and the number of non-zero columns in that matrix. The model is validated by means of a thorough simulation campaign, and the difference between model and simulation is negligible: a mean square error below $4 \cdot 10^{-4}$ in the worst cases. We also compare against some of the more general bounds that have recently been used, showing that their accuracy is rather poor. The proposed model enables a more precise assessment of the behavior of sparse network coding techniques. Our final results show that the analytical model can be exploited by TSNC techniques so that the encoder selects the best density as the transmission evolves.
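To give a flavor of the absorbing-Markov-chain approach, the sketch below models the much simpler dense-RLNC special case, where the state is just the decoder matrix rank (the paper's model additionally tracks the number of non-zero columns to handle sparsity). For dense coding over GF(q), a received packet is innovative at rank $r$ with probability $1 - q^{-(g-r)}$, so the expected number of transmissions to absorb at full rank is a sum of geometric waiting times. The function name and structure are illustrative, not from the paper.

```python
from fractions import Fraction

def expected_transmissions(g, q=2):
    """Expected number of coded packets needed to decode a generation
    of g source packets with dense RLNC over GF(q).

    Absorbing Markov chain: state r = rank of the decoding matrix.
    From state r, a random coded packet is innovative with probability
    p_r = 1 - q^-(g - r), so the expected time to leave state r is 1/p_r.
    The chain absorbs at r = g (full rank, decoding succeeds).
    """
    return sum(1 / (1 - Fraction(1, q ** (g - r))) for r in range(g))

# For GF(2) and g = 16, the overhead beyond the minimum 16 packets is
# small: roughly 1.6 extra transmissions on average.
print(float(expected_transmissions(16, q=2)))
```

Sparse coding replaces the fixed innovation probability with one that depends on both the rank and how many columns of the decoding matrix have been "touched", which is exactly why the paper's chain needs a two-dimensional state.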

Citations (38)
