Lossless Image Compression Using a Multi-Scale Progressive Statistical Model (2108.10551v1)

Published 24 Aug 2021 in eess.IV, cs.CV, and cs.LG

Abstract: Lossless image compression is an important technique for image storage and transmission when information loss is not allowed. With the fast development of deep learning techniques, deep neural networks have been used in this field to achieve higher compression rates. Methods based on pixel-wise autoregressive statistical models have shown good performance, but their sequential processing prevents them from being used in practice. Recently, multi-scale autoregressive models have been proposed to address this limitation. Multi-scale approaches can exploit parallel computing systems efficiently and lead to practical systems; nevertheless, they sacrifice compression performance in exchange for speed. In this paper, we propose a multi-scale progressive statistical model that combines the advantages of the pixel-wise and multi-scale approaches. We developed a flexible mechanism in which the processing order of the pixels can be adjusted easily. Our proposed method outperforms state-of-the-art lossless image compression methods on two large benchmark datasets by a significant margin without dramatically degrading inference speed.
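For intuition, the sketch below illustrates how a multi-scale progressive coding order might be organized: the image is subsampled into a coarse-to-fine pyramid, and within each scale the pixels are split into groups that are coded one after another, so all pixels in the same group can be processed in parallel. This is a minimal illustrative sketch, not the authors' implementation; the function names, the 4-phase checkerboard-style grouping, and the default group ordering are assumptions made here for clarity.

```python
import numpy as np

def build_scales(img, num_scales=3):
    # Build a coarse-to-fine pyramid by repeated 2x subsampling.
    # The coarsest scale is coded first; each finer scale is then
    # predicted from the already-decoded coarser scales.
    scales = [img]
    for _ in range(num_scales - 1):
        scales.append(scales[-1][::2, ::2])
    return scales[::-1]  # coarsest scale first

def progressive_groups(height, width, order=(0, 3, 1, 2)):
    # Split one scale's pixels into groups that are coded sequentially;
    # pixels within a group depend only on earlier groups (and coarser
    # scales), so they can all be processed in parallel.
    ys, xs = np.meshgrid(np.arange(height), np.arange(width), indexing="ij")
    phase = (ys % 2) * 2 + (xs % 2)  # 4 interleaved phases per scale
    return [phase == p for p in order]  # `order` controls the coding order

# Toy walk-through on a random 8-bit "image"
img = np.random.randint(0, 256, size=(16, 16), dtype=np.uint8)
for s, scale in enumerate(build_scales(img)):
    for g, mask in enumerate(progressive_groups(*scale.shape)):
        # In a real codec, a learned model would predict a probability
        # distribution for scale[mask] given everything coded so far,
        # and an entropy coder would use it to emit bits.
        print(f"scale {s}: group {g} codes {mask.sum()} pixels")
```

Because the group ordering is just a permutation passed to `progressive_groups`, the processing order of pixels can be changed without touching the rest of the pipeline, which mirrors the flexible-ordering mechanism the abstract describes.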

Authors (6)
  1. Honglei Zhang (32 papers)
  2. Francesco Cricri (22 papers)
  3. Hamed R. Tavakoli (22 papers)
  4. Nannan Zou (4 papers)
  5. Emre Aksu (16 papers)
  6. Miska M. Hannuksela (6 papers)
Citations (17)
