
Learning Accurate Entropy Model with Global Reference for Image Compression (2010.08321v3)

Published 16 Oct 2020 in eess.IV and cs.CV

Abstract: In recent deep image compression neural networks, the entropy model plays a critical role in estimating the prior distribution of deep image encodings. Existing methods combine a hyperprior with local context in the entropy estimation function, which greatly limits their performance due to the absence of a global vision. In this work, we propose a novel Global Reference Model for image compression that effectively leverages both local and global context information, leading to an improved compression rate. The proposed method scans decoded latents and finds the most relevant latent to assist the distribution estimation of the current latent. A by-product of this work is a mean-shifting GDN module that further improves performance. Experimental results demonstrate that the proposed model outperforms most state-of-the-art methods in rate-distortion performance.
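To make the idea in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a global-reference entropy model: for each latent position it looks up the most similar previously decoded latent (here via cosine similarity) and fuses it with the hyperprior and local context to predict the mean and scale of the current latent. The module names, channel shapes, and the similarity-based lookup are illustrative assumptions, not the authors' implementation, and the mean-shifting GDN module is not sketched here.

```python
# Hypothetical sketch (not the paper's code) of a global-reference entropy model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalReference(nn.Module):
    """For each spatial position, pick the most similar decoded latent vector
    (cosine similarity over a learned projection) as an extra conditioning signal."""
    def __init__(self, channels: int):
        super().__init__()
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)  # query/key projection

    def forward(self, decoded: torch.Tensor) -> torch.Tensor:
        # decoded: (B, C, H, W) latents decoded so far (e.g. zeros where not yet decoded)
        b, c, h, w = decoded.shape
        feat = F.normalize(self.proj(decoded).flatten(2), dim=1)   # (B, C, N), N = H*W
        sim = torch.einsum("bcn,bcm->bnm", feat, feat)             # pairwise cosine similarity
        eye = torch.eye(h * w, device=decoded.device, dtype=torch.bool)
        sim = sim.masked_fill(eye, -1.0)                           # exclude trivial self-matches
        idx = sim.argmax(dim=2)                                    # (B, N): best reference index
        flat = decoded.flatten(2)                                  # (B, C, N)
        ref = torch.gather(flat, 2, idx.unsqueeze(1).expand(-1, c, -1))
        return ref.view(b, c, h, w)                                # reference latent per position

class EntropyParameters(nn.Module):
    """Predict (mean, scale) of each latent from hyperprior, local context, and global reference.
    Assumes all three inputs share the same channel count for simplicity."""
    def __init__(self, channels: int):
        super().__init__()
        self.global_ref = GlobalReference(channels)
        self.fuse = nn.Sequential(
            nn.Conv2d(3 * channels, 2 * channels, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=1),
        )

    def forward(self, hyper, local_ctx, decoded):
        ref = self.global_ref(decoded)
        mean, scale = self.fuse(torch.cat([hyper, local_ctx, ref], dim=1)).chunk(2, dim=1)
        return mean, F.softplus(scale)  # keep the scale positive for the Gaussian entropy model
```

In this sketch the hard argmax lookup is non-differentiable with respect to the similarity scores; gradients still flow through the gathered reference latents, which is enough for an illustrative prototype but is one of the design choices a full implementation would need to treat more carefully.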

Authors (8)
  1. Yichen Qian (10 papers)
  2. Zhiyu Tan (26 papers)
  3. Xiuyu Sun (25 papers)
  4. Ming Lin (65 papers)
  5. Dongyang Li (41 papers)
  6. Zhenhong Sun (12 papers)
  7. Hao Li (803 papers)
  8. Rong Jin (164 papers)
Citations (68)
