Nonlinear Transform Coding (2007.03034v2)

Published 6 Jul 2020 in cs.IT, eess.IV, and math.IT

Abstract: We review a class of methods that can be collected under the name nonlinear transform coding (NTC), which over the past few years have become competitive with the best linear transform codecs for images, and have superseded them in terms of rate–distortion performance under established perceptual quality metrics such as MS-SSIM. We assess the empirical rate–distortion performance of NTC with the help of simple example sources, for which the optimal performance of a vector quantizer is easier to estimate than with natural data sources. To this end, we introduce a novel variant of entropy-constrained vector quantization. We provide an analysis of various forms of stochastic optimization techniques for NTC models; review architectures of transforms based on artificial neural networks, as well as learned entropy models; and provide a direct comparison of a number of methods to parameterize the rate–distortion trade-off of nonlinear transforms, introducing a simplified one.
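
The abstract refers to the general NTC recipe: a learned nonlinear analysis transform maps the source to a latent representation, which is quantized and entropy-coded under a learned prior, then mapped back by a synthesis transform, with everything trained end-to-end on a Lagrangian rate–distortion objective. The sketch below is a minimal illustration of that recipe on a toy Gaussian source, not the paper's implementation: the module names (AnalysisTransform, SynthesisTransform), the uniform-noise quantization proxy, the factorized Gaussian entropy model, and all hyperparameters (lmbda, layer widths, learning rate) are assumptions chosen for brevity, and PyTorch is assumed to be available.

```python
# Minimal NTC training sketch (illustrative; not the paper's code).
import torch
import torch.nn as nn

class AnalysisTransform(nn.Module):       # g_a: source vector -> latent
    def __init__(self, dim_in, dim_latent):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_in, 64), nn.Softplus(),
            nn.Linear(64, dim_latent))
    def forward(self, x):
        return self.net(x)

class SynthesisTransform(nn.Module):      # g_s: latent -> reconstruction
    def __init__(self, dim_latent, dim_out):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_latent, 64), nn.Softplus(),
            nn.Linear(64, dim_out))
    def forward(self, y):
        return self.net(y)

def rate_estimate(y_noisy, scale):
    # Factorized Gaussian entropy model: bits = -log2 of the probability mass
    # of the unit-width quantization bin around each latent coefficient.
    dist = torch.distributions.Normal(0.0, scale)
    p = dist.cdf(y_noisy + 0.5) - dist.cdf(y_noisy - 0.5)
    return -torch.log2(p.clamp_min(1e-9)).sum(dim=-1)

dim, dim_latent, lmbda = 8, 4, 0.05       # lmbda sets the rate-distortion trade-off (assumed value)
g_a = AnalysisTransform(dim, dim_latent)
g_s = SynthesisTransform(dim_latent, dim)
scale = nn.Parameter(torch.ones(dim_latent))
opt = torch.optim.Adam(list(g_a.parameters()) + list(g_s.parameters()) + [scale], lr=1e-3)

for step in range(1000):
    x = torch.randn(32, dim)              # toy i.i.d. Gaussian source
    y = g_a(x)
    y_noisy = y + torch.rand_like(y) - 0.5        # additive uniform noise as a quantization proxy
    x_hat = g_s(y_noisy)
    rate = rate_estimate(y_noisy, nn.functional.softplus(scale)).mean()
    distortion = ((x - x_hat) ** 2).sum(dim=-1).mean()
    loss = rate + lmbda * distortion      # Lagrangian rate-distortion objective
    opt.zero_grad(); loss.backward(); opt.step()
```

Sweeping lmbda over a range of values traces out a rate–distortion curve; the paper compares several ways of parameterizing this trade-off and introduces a simplified one.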

Authors (8)
  1. Johannes Ballé (29 papers)
  2. Philip A. Chou (20 papers)
  3. David Minnen (19 papers)
  4. Saurabh Singh (95 papers)
  5. Nick Johnston (17 papers)
  6. Eirikur Agustsson (27 papers)
  7. Sung Jin Hwang (10 papers)
  8. George Toderici (22 papers)
Citations (186)
