PAMS: Quantized Super-Resolution via Parameterized Max Scale (2011.04212v1)

Published 9 Nov 2020 in eess.IV and cs.CV

Abstract: Deep convolutional neural networks (DCNNs) have shown dominant performance in super-resolution (SR). However, their heavy memory cost and computation overhead significantly restrict practical deployment on resource-limited devices, mainly due to the floating-point storage and floating-point operations between weights and activations. Although previous work mostly resorts to fixed-point operations, quantizing both weights and activations with fixed coding lengths may cause a significant performance drop, especially at low bit-widths. Moreover, most state-of-the-art SR models lack batch normalization and therefore have a large dynamic quantization range, a further cause of performance degradation. To address these two issues, we propose a new quantization scheme termed PArameterized Max Scale (PAMS), which applies a trainable truncation parameter to adaptively explore the upper bound of the quantization range. In addition, a structured knowledge transfer (SKT) loss is introduced to fine-tune the quantized network. Extensive experiments demonstrate that the proposed PAMS scheme effectively compresses and accelerates existing SR models such as EDSR and RDN. Notably, 8-bit PAMS-EDSR improves PSNR on the Set5 benchmark from 32.095 dB to 32.124 dB with a 2.42$\times$ compression ratio, achieving a new state of the art.
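
To make the two ingredients concrete, below is a minimal PyTorch sketch of a PAMS-style activation quantizer with a trainable truncation parameter, plus an SKT-style feature-matching loss. The names `PAMSQuantizer` and `skt_loss`, the initialization of `alpha`, and the exact form of the distillation loss are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PAMSQuantizer(nn.Module):
    """PAMS-style activation quantizer (illustrative sketch).

    A trainable upper bound ``alpha`` parameterizes the max scale of the
    quantization range: activations are clipped to [-alpha, alpha] and
    uniformly quantized to ``n_bits``, with a straight-through estimator
    so gradients reach both the input and ``alpha``.
    """

    def __init__(self, n_bits: int = 8, init_alpha: float = 10.0):
        super().__init__()
        self.n_bits = n_bits
        # init_alpha is a hypothetical initialization, not the paper's value.
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        qmax = 2 ** (self.n_bits - 1) - 1        # e.g. 127 for 8 bits
        alpha = self.alpha.abs()                 # keep the learned bound positive
        scale = alpha / qmax
        x_clip = torch.clamp(x, -alpha, alpha)   # trainable truncation
        x_quant = torch.round(x_clip / scale) * scale
        # Straight-through estimator: rounding acts as identity in backward,
        # so gradients flow to both the input and alpha through the clamp.
        return x_clip + (x_quant - x_clip).detach()


def skt_loss(student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
    """One plausible structured knowledge transfer loss: match L2-normalized
    feature maps between the full-precision teacher and the quantized
    student. The exact form defined in the paper may differ."""
    s = F.normalize(student_feat.flatten(start_dim=1), dim=1)
    t = F.normalize(teacher_feat.flatten(start_dim=1), dim=1)
    return F.mse_loss(s, t)
```

In a setup like this, the quantizer would wrap each convolution's input activations, and the quantized student network would be fine-tuned against its full-precision teacher with `skt_loss` added to the usual reconstruction loss, matching the fine-tuning role the abstract assigns to SKT.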

Authors (8)
  1. Huixia Li (16 papers)
  2. Chenqian Yan (9 papers)
  3. Shaohui Lin (45 papers)
  4. Xiawu Zheng (63 papers)
  5. Yuchao Li (24 papers)
  6. Baochang Zhang (113 papers)
  7. Fan Yang (878 papers)
  8. Rongrong Ji (315 papers)
Citations (77)
