The Capacity of Random Linear Coding Networks as Subspace Channels (1001.1021v2)
Published 7 Jan 2010 in cs.IT and math.IT
Abstract: In this paper, we consider noncoherent random linear coding networks (RLCNs) as a discrete memoryless channel (DMC) whose input and output alphabets consist of subspaces. This contrasts with previous channel models in the literature, which take matrices as the channel input and output. No particular assumptions are made on the network topology or the transfer matrix, except that the latter may be rank-deficient according to some rank-deficiency probability distribution. We introduce a random vector basis selection procedure which renders the DMC symmetric. The capacity we derive can be seen as a lower bound on the capacity of noncoherent RLCNs, and subspace coding suffices to achieve this bound.
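As a rough illustration of the subspace-channel viewpoint (a minimal sketch, not code from the paper), the following Python snippet works over GF(2) and assumes a generation of n packets of m symbols each; the names rref_gf2, n, m, X, A, Y are all hypothetical. It shows that when the receiver collects Y = AX for a random, possibly rank-deficient transfer matrix A, the row space of Y is always a subspace of the row space of X, so a noncoherent receiver that only observes subspaces still captures what survives the network.

```python
import numpy as np

def rref_gf2(M):
    """Reduced row echelon form over GF(2); returns the nonzero rows,
    i.e. a canonical basis for the row space of M."""
    M = M.copy() % 2
    rows, cols = M.shape
    pivot = 0
    for c in range(cols):
        if pivot >= rows:
            break
        # find a row with a 1 in column c at or below the current pivot row
        r = next((i for i in range(pivot, rows) if M[i, c]), None)
        if r is None:
            continue
        M[[pivot, r]] = M[[r, pivot]]
        # clear column c in every other row
        for i in range(rows):
            if i != pivot and M[i, c]:
                M[i] = (M[i] + M[pivot]) % 2
        pivot += 1
    return M[np.any(M, axis=1)]  # drop all-zero rows

rng = np.random.default_rng(0)
n, m = 4, 8                                   # assumed generation size / packet length
X = rng.integers(0, 2, size=(n, m))           # transmitted packets; rows span the input subspace
A = rng.integers(0, 2, size=(n, n))           # random transfer matrix, possibly rank-deficient
Y = (A @ X) % 2                               # packets collected at the receiver

U = rref_gf2(X)                               # basis of the input subspace <X>
V = rref_gf2(Y)                               # basis of the output subspace <Y>

# <Y> is a subspace of <X>: adjoining Y's rows does not enlarge the row space.
assert len(rref_gf2(np.vstack([X, Y]))) == len(U)
# A rank-deficient A shows up as dim <Y> < dim <X>.
print("dim <X> =", len(U), " dim <Y> =", len(V))
```

In this toy model the only effect of the network the receiver can observe is the drop from dim <X> to dim <Y>, which is why, under the paper's model, the channel can be described directly on subspaces and the transfer matrix enters only through its rank-deficiency distribution.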