Vector Network Coding Based on Subspace Codes Outperforms Scalar Linear Network Coding (1512.06352v5)

Published 20 Dec 2015 in cs.IT and math.IT

Abstract: This paper considers vector network coding solutions based on rank-metric codes and subspace codes. The main result of this paper is that vector solutions can significantly reduce the required alphabet size compared to the optimal scalar linear solution for the same multicast network. The multicast networks considered in this paper have one source with $h$ messages, and the vector solution is over a field of size $q$ with vectors of length $t$. For a given network, let the smallest field size for which the network has a scalar linear solution be $q_s$; the gap in the alphabet size between the vector solution and the scalar linear solution is then defined to be $q_s - q^t$. In this contribution, the achieved gap is $q^{(h-2)t^2/h + o(t)}$ for any $q \geq 2$ and any even $h \geq 4$. If $h \geq 5$ is odd, then the achieved gap of the alphabet size is $q^{(h-3)t^2/(h-1) + o(t)}$. Previously, only a gap of size one had been shown for networks with a very large number of messages. These results imply the same alphabet-size gap between the optimal scalar linear solution and some scalar nonlinear network coding solution for multicast networks. For three messages, we also show an advantage of vector network coding, while for two messages the problem remains open. Several networks are considered, all of which are generalizations and modifications of the well-known combination networks. The vector network codes used as solutions for those networks are based on subspace codes, in particular subspace codes obtained from rank-metric codes. Some of these codes form a new family of subspace codes, which poses a new research problem.
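To make the size of this gap concrete, here is an illustrative back-of-the-envelope computation; the specific numbers are chosen for this example and do not appear in the abstract. Set $q = 2$ and $h = 4$, so the achieved gap is $2^{(4-2)t^2/4 + o(t)} = 2^{t^2/2 + o(t)}$. At vector length $t = 10$, the vector solution operates over an alphabet of size $q^t = 2^{10} = 1024$, while the gap $q_s - q^t$ is, up to the $o(t)$ term in the exponent, on the order of $2^{50}$. In other words, for such a network the smallest field size $q_s$ admitting a scalar linear solution exceeds the vector alphabet size by a factor growing exponentially in $t^2$.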
