Universal Compression with Side Information from a Correlated Source (1901.03625v1)

Published 11 Jan 2019 in cs.IT and math.IT

Abstract: Packets originating from an information source in a network can be highly correlated. These packets are often routed through different paths, and compressing them requires processing each packet individually. Traditional universal compression solutions do not perform well on a single packet because of the limited data available for learning the unknown source parameters. In this paper, we define a notion of correlation between information sources and characterize the average redundancy of universal compression with side information from a correlated source. We define the side information gain as the ratio of the average maximin redundancy of universal compression without side information to that with side information. We derive a lower bound on the side information gain, showing that the presence of side information provides at least 50% traffic reduction over traditional universal compression when applied to network packet data, confirming previous empirical studies.
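
A minimal formalization of the quantity described in the abstract, under assumed notation: here $\bar{R}_n$ and $\bar{R}^{\mathrm{SI}}_n$ denote the average maximin redundancy of universal compression of a length-$n$ packet without and with side information, respectively; these symbol names are chosen for illustration and may differ from the paper's.

$$
g(n) \;\triangleq\; \frac{\bar{R}_n}{\bar{R}^{\mathrm{SI}}_n}
$$

A lower bound on $g(n)$ then quantifies how much of the redundancy incurred by learning the unknown source parameters from a single packet is removed when side information from a correlated source is available at the compressor and decompressor.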
