AdaDM: Enabling Normalization for Image Super-Resolution (2111.13905v1)

Published 27 Nov 2021 in eess.IV, cs.CV, and cs.LG

Abstract: Normalization techniques such as Batch Normalization (BN) are a milestone in deep learning, normalizing the distributions of intermediate layers to enable faster training and better generalization accuracy. However, in image Super-Resolution (SR), normalization layers are believed to discard range flexibility by normalizing the features, so they are simply removed from modern SR networks. In this paper, we study this phenomenon quantitatively and qualitatively. We find that the standard deviation of the residual feature shrinks considerably after normalization layers, which causes performance degradation in SR networks. The standard deviation reflects the amount of variation in pixel values; when this variation becomes smaller, edges become less discriminative for the network to resolve. To address this problem, we propose an Adaptive Deviation Modulator (AdaDM), in which a modulation factor is adaptively predicted to amplify the pixel deviation. For better generalization performance, we apply BN in state-of-the-art SR networks together with the proposed AdaDM. The deviation amplification in AdaDM also makes edge information in the feature more distinguishable. As a consequence, SR networks equipped with BN and AdaDM achieve substantial performance improvements on benchmark datasets. Extensive experiments demonstrate the effectiveness of our method.
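
The mechanism described in the abstract lends itself to a short sketch. The PyTorch block below is a minimal illustration under stated assumptions, not the authors' implementation: the module names, the per-sample standard deviation over channel and spatial dimensions, and the learnable affine transform on log(sigma) used to predict the modulation factor are all hypothetical choices made for this example; consult the paper for the exact formulation.

```python
import torch
import torch.nn as nn

class AdaDM(nn.Module):
    """Sketch of an Adaptive Deviation Modulator (assumed form).

    Idea from the abstract: BN shrinks the standard deviation of the
    residual feature, so a modulation factor predicted from that
    deviation re-amplifies it after normalization.
    """
    def __init__(self, num_channels: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(num_channels)
        # Hypothetical predictor: learnable affine on log(sigma).
        self.gamma = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-sample deviation of the incoming (N, C, H, W) feature.
        sigma = x.std(dim=(1, 2, 3), keepdim=True) + 1e-8
        # Positive modulation factor predicted from log(sigma).
        factor = torch.exp(self.gamma * torch.log(sigma) + self.beta)
        # Normalize, then amplify the pixel deviation again.
        return self.norm(x) * factor

class ResidualBlock(nn.Module):
    """EDSR-style residual block with BN re-enabled through AdaDM."""
    def __init__(self, num_channels: int = 64):
        super().__init__()
        self.conv1 = nn.Conv2d(num_channels, num_channels, 3, padding=1)
        self.conv2 = nn.Conv2d(num_channels, num_channels, 3, padding=1)
        self.act = nn.ReLU(inplace=True)
        self.adadm = AdaDM(num_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        res = self.conv2(self.act(self.conv1(x)))
        return x + self.adadm(res)

# Quick shape check.
block = ResidualBlock(64)
out = block(torch.randn(2, 64, 32, 32))
print(out.shape)  # torch.Size([2, 64, 32, 32])
```

Measuring the feature's deviation before normalization and reapplying it multiplicatively afterward is one plausible way to preserve the range flexibility that plain BN removes; the paper's results quantify how much this recovers.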

Citations (11)
