
Thousand to One: Semantic Prior Modeling for Conceptual Coding (2103.07131v2)

Published 12 Mar 2021 in cs.CV and eess.IV

Abstract: Conceptual coding is an emerging research topic that encodes natural images into disentangled conceptual representations for compression. However, the compression performance of existing methods remains sub-optimal due to the lack of comprehensive consideration of rate constraints and reconstruction quality. To this end, we propose a novel end-to-end semantic prior modeling-based conceptual coding scheme for extremely low bitrate image compression, which leverages semantic-wise deep representations as a unified prior for entropy estimation and texture synthesis. Specifically, we employ semantic segmentation maps as structural guidance for extracting a deep semantic prior, which provides fine-grained texture distribution modeling for better detail reconstruction and higher flexibility in subsequent high-level vision tasks. Moreover, a cross-channel entropy model is proposed to further exploit the inter-channel correlation of the spatially independent semantic prior, leading to more accurate entropy estimation for rate-constrained training. The proposed scheme achieves an ultra-high 1000x compression ratio while still enjoying high visual reconstruction quality and versatility towards visual processing and analysis tasks.
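The cross-channel entropy model described above can be illustrated with a minimal numpy sketch: each latent channel's entropy parameters are conditioned on previously coded channels, and the rate is estimated as the negative log-likelihood under a discretized Gaussian (a common choice in learned compression; the actual parameter predictor in the paper is a learned network, so `toy_predict` below is a hypothetical stand-in).

```python
import math
import numpy as np

def discretized_gaussian_bits(y, mu, sigma):
    """Rate estimate: -log2 P(y), with P(y) = CDF(y+0.5) - CDF(y-0.5)
    under a Gaussian N(mu, sigma^2) evaluated at integer-quantized latents."""
    cdf = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))
    upper = cdf((y + 0.5 - mu) / sigma)
    lower = cdf((y - 0.5 - mu) / sigma)
    p = np.clip(upper - lower, 1e-9, 1.0)
    return -np.log2(p)

def cross_channel_rate(latent, predict):
    """Total estimated bits for a (C, N) latent, where each channel's
    entropy parameters are predicted from the channels coded before it."""
    num_channels, _ = latent.shape
    total_bits = 0.0
    for c in range(num_channels):
        mu, sigma = predict(latent[:c], c)  # condition on channels < c
        total_bits += discretized_gaussian_bits(latent[c], mu, sigma).sum()
    return total_bits

def toy_predict(prev_channels, c):
    """Hypothetical parameter predictor: mean of previously coded channels
    as mu, unit scale. The paper uses a learned cross-channel network."""
    if prev_channels.shape[0] == 0:
        return 0.0, 1.0  # no context for the first channel
    return prev_channels.mean(axis=0), 1.0

rng = np.random.default_rng(0)
y = np.round(rng.normal(0.0, 1.0, size=(4, 16)))  # quantized toy latent
bits = cross_channel_rate(y, toy_predict)
```

In rate-constrained training, an estimate like `bits` would enter the loss alongside a reconstruction term, so the encoder learns latents that are both cheap to code and sufficient for texture synthesis.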

Authors (6)
  1. Jianhui Chang (5 papers)
  2. Zhenghui Zhao (6 papers)
  3. Lingbo Yang (9 papers)
  4. Chuanmin Jia (24 papers)
  5. Jian Zhang (543 papers)
  6. Siwei Ma (86 papers)
Citations (20)
