A Compression Objective and a Cycle Loss for Neural Image Compression (1905.10371v1)

Published 24 May 2019 in eess.IV, cs.LG, and stat.ML

Abstract: In this manuscript we propose two objective terms for neural image compression: a compression objective and a cycle loss. These terms are applied on the encoder output of an autoencoder and are used in combination with reconstruction losses. The compression objective encourages sparsity and low entropy in the activations. The cycle loss term represents the distortion between encoder outputs computed from the original image and from the reconstructed image (code-domain distortion). We train different autoencoders by using the compression objective in combination with different losses: a) MSE, b) MSE and MS-SSIM, c) MSE, MS-SSIM and cycle loss. We observe that images encoded by these differently-trained autoencoders fall into different points of the perception-distortion curve (while having similar bit-rates). In particular, MSE-only training favors low image-domain distortion, whereas cycle loss training favors high perceptual quality.
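The two loss terms described above can be sketched as follows. This is a minimal, hedged illustration in NumPy, not the authors' implementation: the exact sparsity/entropy formulation of the compression objective and the code-domain distance used for the cycle loss are assumptions here, chosen only to match the abstract's description (L1 activations for sparsity, Shannon entropy of normalized activation magnitudes, and MSE between codes for the cycle term).

```python
import numpy as np

def compression_objective(z, eps=1e-9):
    """Sketch of a term encouraging sparse, low-entropy encoder activations.

    `z` is the encoder output; the exact formulation in the paper may differ.
    """
    sparsity = np.mean(np.abs(z))            # L1 term promotes sparsity
    p = np.abs(z).ravel()
    p = p / (p.sum() + eps)                  # normalize magnitudes to a distribution
    entropy = -np.sum(p * np.log(p + eps))   # Shannon entropy of that distribution
    return sparsity + entropy

def cycle_loss(encoder, decoder, x):
    """Code-domain distortion: distance between the code of the original
    image and the code of its reconstruction (here, MSE between codes)."""
    z = encoder(x)             # code of the original image
    x_hat = decoder(z)         # reconstructed image
    z_cycle = encoder(x_hat)   # code of the reconstruction
    return np.mean((z - z_cycle) ** 2)
```

For a perfectly invertible encoder/decoder pair the cycle loss is zero; during training it penalizes reconstructions whose re-encoded codes drift from the original codes, which the abstract associates with higher perceptual quality.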

Authors (6)
  1. Caglar Aytekin (13 papers)
  2. Francesco Cricri (22 papers)
  3. Antti Hallapuro (3 papers)
  4. Jani Lainema (7 papers)
  5. Emre Aksu (16 papers)
  6. Miska Hannuksela (7 papers)
Citations (9)