
Exploiting context dependence for image compression with upsampling (2004.03391v3)

Published 6 Apr 2020 in eess.IV, cs.LG, cs.MM, and stat.ML

Abstract: Image compression with upsampling encodes information to successively increase image resolution, for example by encoding differences as in FUIF and JPEG XL. It is useful for progressive decoding and often also improves compression ratio, both for lossless compression and, e.g., for DC coefficients in lossy compression. However, currently used solutions generally do not exploit context dependence when encoding such upscaling information. This article discusses simple, inexpensive, general techniques for this purpose, which saved on average $0.645$ bits/difference (between $0.138$ and $1.489$) for the last upscaling step on 48 standard $512\times 512$ grayscale 8-bit images, compared to assuming a fixed Laplace distribution. Using least-squares linear regression of the context to predict the center of the Laplace distribution gave on average $0.393$ bits/difference of savings. The remaining savings were obtained by additionally predicting the width of this Laplace distribution, again using only least-squares linear regression. For RGB images, optimizing the color transform alone gave a mean $\approx 4.6\%$ size reduction compared to standard YCrCb when using one fixed transform, and $\approx 6.3\%$ when optimizing the transform individually for each image. A further mean $\approx 10\%$ reduction was then obtained by predicting Laplace parameters from context. The presented simple, inexpensive, general methodology can also be applied to other types of data, such as DCT coefficients in lossy image compression.
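The core technique in the abstract, predicting both the center and the width of a Laplace distribution from context via least-squares linear regression, can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation: the context vectors, weights, and the global-Laplace baseline are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "upscaling differences": each difference d has a context vector
# (standing in for already-decoded neighboring values). Illustrative setup only.
n, k = 10000, 4
C = rng.normal(size=(n, k))
true_w = np.array([0.8, -0.5, 0.3, 0.1])          # hypothetical dependence
d = rng.laplace(loc=C @ true_w, scale=0.7, size=n)

A = np.hstack([C, np.ones((n, 1))])               # affine regression model

# Stage 1: least-squares regression of context -> Laplace center mu.
w_mu, *_ = np.linalg.lstsq(A, d, rcond=None)
mu_hat = A @ w_mu

# Stage 2: regress context -> |residual| to predict the Laplace scale b
# (the mean absolute deviation of Laplace(mu, b) equals b).
r = np.abs(d - mu_hat)
w_b, *_ = np.linalg.lstsq(A, r, rcond=None)
b_hat = np.maximum(A @ w_b, 1e-3)                 # keep predicted scale positive

# Entropy-coding cost in bits under Laplace(mu, b): -log2 of the density
# (ignoring quantization, which adds the same constant to both models).
def laplace_bits(x, mu, b):
    return (np.log(2 * b) + np.abs(x - mu) / b) / np.log(2)

# Baseline: one fixed Laplace fitted globally (ML: center = median,
# scale = mean absolute deviation).
mu0 = np.median(d)
b0 = np.mean(np.abs(d - mu0))
bits_fixed = laplace_bits(d, mu0, b0).mean()
bits_adaptive = laplace_bits(d, mu_hat, b_hat).mean()
print(f"savings: {bits_fixed - bits_adaptive:.3f} bits/difference")
```

Because the context genuinely shifts the differences in this toy setup, the adaptive model spends fewer bits per difference than the fixed-Laplace baseline, mirroring the kind of savings the abstract reports.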
