
Fast Approximate Spectral Normalization for Robust Deep Neural Networks

Published 22 Mar 2021 in cs.LG (arXiv:2103.13815v1)

Abstract: Deep neural networks (DNNs) play an important role in machine learning due to their outstanding performance compared to alternative approaches. However, DNNs are not suitable for safety-critical applications, since they can be easily fooled by well-crafted adversarial examples. One promising strategy to counter adversarial attacks is spectral normalization, which ensures that the trained model has low sensitivity to perturbations of the input samples. Unfortunately, this strategy requires exact computation of the spectral norm, which is computationally intensive and impractical for large-scale networks. In this paper, we introduce an approximate algorithm for spectral normalization based on the Fourier transform and layer separation. The primary contribution of our work is to effectively combine the sparsity of the weight matrix with the decomposability of convolution layers. Extensive experimental evaluation demonstrates that our framework significantly improves both time efficiency (by up to 60%) and model robustness (by 61% on average) compared with state-of-the-art spectral normalization.
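The paper's exact algorithm is not reproduced here. As a rough illustration of the general idea of Fourier-based spectral norm estimation for convolution layers, the sketch below computes the per-frequency channel matrices of a kernel via a 2D FFT, takes the maximum of their largest singular values, and rescales the kernel to enforce a target bound. The function names `conv_spectral_norm_fft` and `spectrally_normalize_kernel`, the circular-convolution assumption, and the use of NumPy are illustrative choices, not details taken from the paper.

```python
import numpy as np

def conv_spectral_norm_fft(kernel, input_shape):
    """Estimate the spectral norm of a 2D convolution layer via the FFT.

    kernel: array of shape (c_out, c_in, kh, kw)
    input_shape: spatial size (H, W) of the layer input; the kernel is
        zero-padded to this size, which treats the convolution as circular
        (an approximation for standard zero-padded convolutions).
    """
    c_out, c_in, _, _ = kernel.shape
    H, W = input_shape
    # Per-(output, input)-channel 2D FFT of the zero-padded kernel.
    freq = np.fft.fft2(kernel, s=(H, W), axes=(2, 3))  # (c_out, c_in, H, W)
    # At each frequency (u, v) the convolution acts as the c_out x c_in
    # matrix freq[:, :, u, v]; the layer's spectral norm is the maximum of
    # the largest singular values of these matrices over all frequencies.
    per_freq = freq.transpose(2, 3, 0, 1).reshape(-1, c_out, c_in)
    largest_singular = np.linalg.norm(per_freq, ord=2, axis=(1, 2))
    return largest_singular.max()

def spectrally_normalize_kernel(kernel, input_shape, target=1.0):
    """Rescale the kernel so its estimated spectral norm is at most `target`."""
    sigma = conv_spectral_norm_fft(kernel, input_shape)
    return kernel if sigma <= target else kernel * (target / sigma)

# Usage example on a random 3x3 kernel with 16 output and 8 input channels.
rng = np.random.default_rng(0)
w = rng.standard_normal((16, 8, 3, 3))
print(conv_spectral_norm_fft(w, (32, 32)))
w_sn = spectrally_normalize_kernel(w, (32, 32))
print(conv_spectral_norm_fft(w_sn, (32, 32)))  # close to 1.0 after rescaling
```

For strided or zero-padded convolutions this FFT value is only an approximation of the true operator norm, and the paper's method additionally exploits weight sparsity and layer separation, which this sketch does not attempt to model.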
