
Multiple-Output Channel Simulation and Lossy Compression of Probability Distributions (2105.01045v2)

Published 3 May 2021 in cs.IT and math.IT

Abstract: We consider a variant of the channel simulation problem with a single input and multiple outputs, where Alice observes a probability distribution $P$ from a set of prescribed probability distributions $\mathcal{P}$, and sends a prefix-free codeword $W$ to Bob to allow him to generate $n$ i.i.d. random variables $X_1, X_2, \ldots, X_n$ which follow the distribution $P$. This can also be regarded as a lossy compression setting for probability distributions. This paper describes encoding schemes for three cases of $P$: $P$ is a distribution over the positive integers, $P$ is a continuous distribution over $[0,1]$ with a non-increasing pdf, and $P$ is a continuous distribution over $[0,\infty)$ with a non-increasing pdf. We show that the growth rate of the expected codeword length is sub-linear in $n$ when a power law bound is satisfied. An application of multiple-output channel simulation is the compression of probability distributions.
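To make the setting concrete, here is a toy sketch (not the paper's scheme) of lossy compression of a distribution over positive integers: Alice quantizes $P$ to dyadic probabilities, describes the quantized distribution with a fixed-width bit string agreed upon in advance, and Bob samples $n$ i.i.d. values from the decoded distribution. The function names, the fixed-width code, and the `precision` parameter are all illustrative assumptions.

```python
import random

def dyadic_approx(p, precision=8):
    # Round each probability down to a multiple of 2^-precision,
    # then give the leftover mass to the first symbol so it sums to 1.
    # (Illustrative quantizer, not the encoding scheme from the paper.)
    scale = 1 << precision
    q = [int(pi * scale) for pi in p]
    q[0] += scale - sum(q)
    return [qi / scale for qi in q]

def encode(q, precision=8):
    # Describe the quantized distribution as a bit string: each
    # probability as a fixed-width integer. With the alphabet size and
    # precision agreed upon in advance, the description length is fixed.
    scale = 1 << precision
    return ''.join(format(int(qi * scale), f'0{precision}b') for qi in q)

def decode(bits, precision=8):
    scale = 1 << precision
    vals = [int(bits[i:i + precision], 2)
            for i in range(0, len(bits), precision)]
    return [v / scale for v in vals]

def sample(q, n, rng):
    # Bob draws n i.i.d. values from the decoded distribution
    # over the positive integers {1, ..., len(q)}.
    return rng.choices(range(1, len(q) + 1), weights=q, k=n)
```

Note that in this toy version the codeword length does not grow with $n$ at all, at the cost of Bob sampling from a quantized approximation of $P$ rather than $P$ itself; the paper's schemes instead produce exactly $P$-distributed outputs with expected codeword length growing sub-linearly in $n$.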

Citations (1)
