Entropy Conserving Binarization Scheme for Video and Image Compression (1408.3083v1)

Published 13 Aug 2014 in cs.IT, cs.MM, and math.IT

Abstract: The paper presents a binarization scheme that converts non-binary data into a set of binary strings. Many binarization algorithms exist, but each is optimal only for specific probability distributions of the data source. To overcome this problem, the paper shows that the presented binarization scheme conserves the entropy of the original data for any probability distribution of an $m$-ary source. A major advantage of the scheme is that it conserves entropy without knowledge of the source or of the probability distribution of the source symbols. The scheme has linear complexity in the length of the input data. It can be implemented in Context-based Adaptive Binary Arithmetic Coding (CABAC) for video and image compression. It can also be used by universal data compression algorithms that have high complexity when compressing non-binary data, and by binary data compression algorithms to optimally compress non-binary data.
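
The abstract does not detail the scheme's construction, but the entropy-conservation claim can be illustrated with the standard chain-rule argument: under any prefix-free binarization, the source entropy equals the sum of the conditional entropies of the individual bins, provided each bin is modelled given the bins that precede it (which is what CABAC-style context modelling approximates). The sketch below checks this numerically for a hypothetical 4-ary distribution and a truncated-unary binarization; both the distribution and the mapping are illustrative assumptions, not the paper's method.

```python
# Illustrative sketch (not the paper's scheme): the chain rule of entropy
# implies that a prefix-free binarization conserves source entropy when
# each bin is coded with its conditional probability given the preceding
# bins of the same symbol.
from collections import defaultdict
from math import log2

# Hypothetical 4-ary source with a skewed distribution (assumed values).
source_probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}

# Truncated-unary binarization, one of the standard CABAC binarizations.
binarization = {"a": "0", "b": "10", "c": "110", "d": "111"}

# Entropy of the original m-ary source, in bits per symbol.
h_source = -sum(p * log2(p) for p in source_probs.values())

# Accumulate, for every internal node (bin prefix) of the binarization
# tree, the probability of reaching it and of taking each branch.
node_mass = defaultdict(float)    # P(reaching a given prefix)
branch_mass = defaultdict(float)  # P(prefix, next bit)
for symbol, bits in binarization.items():
    p = source_probs[symbol]
    prefix = ""
    for bit in bits:
        node_mass[prefix] += p
        branch_mass[(prefix, bit)] += p
        prefix += bit

# Sum of conditional bin entropies, weighted by the node probabilities.
h_bins = 0.0
for (prefix, bit), mass in branch_mass.items():
    cond = mass / node_mass[prefix]   # P(bit | prefix)
    h_bins += -mass * log2(cond)

print(f"H(source) = {h_source:.4f} bits/symbol")
print(f"sum of conditional bin entropies = {h_bins:.4f} bits/symbol")
```

Both printed values agree (about 1.743 bits/symbol), which is the chain rule of entropy at work; the paper's stated contribution is achieving such conservation without prior knowledge of the source or its symbol probabilities.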
