
Memory-Assisted Universal Source Coding

Published 10 Jan 2012 in cs.IT and math.IT (arXiv:1201.2199v1)

Abstract: The problem of universal compression of a sequence, drawn from a library of several small to moderate length sequences with similar context, arises in many practical scenarios, such as the compression of storage data and Internet traffic. In such scenarios, it is often required to compress and decompress every sequence individually. However, universal compression of the individual sequences suffers from significant redundancy overhead. In this paper, we aim at answering whether or not having a memory unit in the middle can result in a fundamental gain in universal compression. We present the problem setup in the most basic scenario consisting of a server node $S$, a relay node $R$ (i.e., the memory unit), and a client node $C$. We assume that server $S$ wishes to send the sequence $x^n$ to the client $C$, who has never had any prior communication with the server and hence is not capable of memorizing the source context. However, $R$ has previously communicated with $S$ to forward previous sequences from $S$ to clients other than $C$, and thus $R$ has memorized a context $y^m$ shared with $S$. Note that if the relay node were absent, the source could simply apply universal compression to $x^n$ and transmit it to $C$, whereas the presence of the memorized context at $R$ can potentially reduce the communication overhead on the $S$-$R$ link. In this paper, we investigate the fundamental gain of context memorization in the memory-assisted universal compression of the sequence $x^n$ over conventional universal source coding by providing a lower bound on the gain of memory-assisted source coding.
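The paper's analysis is information-theoretic, but a rough, hypothetical illustration of why a shared memorized context helps is sketched below in Python. It compares stand-alone DEFLATE compression of a short sequence with DEFLATE using a preset dictionary built from previously seen traffic (standing in for the memorized context $y^m$ at $R$). The example data, the use of zlib, and the preset-dictionary trick are illustrative assumptions only, not the universal coding scheme analyzed in the paper.

```python
import zlib

# Hypothetical stand-in for the paper's setup: y plays the role of the context
# y^m memorized at the relay R, and x plays the role of the new sequence x^n
# that must be delivered to a client with no memory of prior traffic.
y = (b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n"
     b"Accept: text/html\r\nConnection: keep-alive\r\n\r\n") * 50
x = (b"GET /images/logo.png HTTP/1.1\r\nHost: example.com\r\n"
     b"Accept: image/png\r\nConnection: keep-alive\r\n\r\n")

# Conventional universal compression: x is encoded entirely on its own.
baseline_bits = 8 * len(zlib.compress(x, 9))

# Memory-assisted compression on the S-R link: S and R both hold y, so S can
# use it as a preset dictionary when encoding x, and R uses the same
# dictionary to decode before forwarding to the memoryless client C.
enc = zlib.compressobj(level=9, zdict=y)
payload = enc.compress(x) + enc.flush()
assisted_bits = 8 * len(payload)

dec = zlib.decompressobj(zdict=y)
assert dec.decompress(payload) == x  # R recovers x exactly using its memory y

print(f"stand-alone: {baseline_bits} bits, memory-assisted: {assisted_bits} bits")
```

On short, repetitive inputs like this, the preset-dictionary encoding is typically much smaller than the stand-alone one, mirroring the redundancy overhead that the paper attributes to universally compressing short sequences in isolation; the exact numbers here depend on the toy data and on zlib, not on the paper's bounds.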

Citations (12)

