Divergence Scaling of Fixed-Length, Binary-Output, One-to-One Distribution Matching (1701.07371v3)

Published 25 Jan 2017 in cs.IT and math.IT

Abstract: Distribution matching is the process of invertibly mapping a uniformly distributed input sequence onto sequences that approximate the output of a desired discrete memoryless source. The special case of a binary output alphabet and one-to-one mapping is studied. A fixed-length distribution matcher is proposed that is optimal in the sense of minimizing the unnormalized informational divergence between its output distribution and a binary memoryless target distribution. Upper and lower bounds on the unnormalized divergence are computed that increase logarithmically in the output block length $n$. It follows that a recently proposed constant composition distribution matcher performs within a constant gap of the minimal achievable informational divergence.
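
Because the input is uniform and the mapping is one-to-one, the matcher's output distribution is uniform over the $2^k$ chosen codewords, so minimizing the unnormalized divergence $D(P_{\text{out}} \Vert Q^n)$ reduces to picking the $2^k$ most probable length-$n$ sequences under the Bernoulli($p$) target $Q$. The following sketch illustrates this selection rule by brute force for small block lengths; the function name and enumeration strategy are illustrative choices, not taken from the paper.

```python
import math

def min_divergence_matcher(n, k, p):
    """Illustrative optimal fixed-length, one-to-one, binary-output matcher.

    Maps the 2**k uniform input messages to the 2**k most probable
    length-n binary sequences under a Bernoulli(p) target source Q.
    With a uniform input, this choice minimizes the unnormalized
    informational divergence D(P_out || Q^n) over all one-to-one codes.
    Brute-force enumeration: feasible only for small n.
    """
    assert k <= n, "need 2**k codewords among 2**n sequences"
    M = 2 ** k
    # Q^n(x) depends only on the Hamming weight w of x:
    # Q^n(x) = p**w * (1 - p)**(n - w).
    seqs = sorted(range(2 ** n),
                  key=lambda x: bin(x).count("1"),
                  reverse=(p > 0.5))  # most probable weights first
    codebook = seqs[:M]
    # D(P_out || Q^n) = sum over codewords of (1/M) * log2((1/M) / Q^n(x))
    div = 0.0
    for x in codebook:
        w = bin(x).count("1")
        q = p ** w * (1 - p) ** (n - w)
        div += (1.0 / M) * math.log2((1.0 / M) / q)
    return codebook, div

codebook, d = min_divergence_matcher(n=8, k=4, p=0.3)
print(f"D(P_out || Q^n) = {d:.4f} bits")
```

Computing this divergence while growing $n$ (with the input rate $k/n$ held near the target entropy) is one way to observe the logarithmic scaling in $n$ that the paper's upper and lower bounds establish.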

Citations (12)
