SPIN: An Empirical Evaluation on Sharing Parameters of Isotropic Networks (2207.10237v1)

Published 21 Jul 2022 in cs.CV

Abstract: Recent isotropic networks, such as ConvMixer and vision transformers, have found significant success across visual recognition tasks, matching or outperforming non-isotropic convolutional neural networks (CNNs). Isotropic architectures are particularly well-suited to cross-layer weight sharing, an effective neural network compression technique. In this paper, we perform an empirical evaluation on methods for sharing parameters in isotropic networks (SPIN). We present a framework to formalize major weight sharing design decisions and perform a comprehensive empirical evaluation of this design space. Guided by our experimental results, we propose a weight sharing strategy to generate a family of models with better overall efficiency, in terms of FLOPs and parameters versus accuracy, compared to traditional scaling methods alone, for example compressing ConvMixer by 1.9x while improving accuracy on ImageNet. Finally, we perform a qualitative study to further understand the behavior of weight sharing in isotropic architectures. The code is available at https://github.com/apple/ml-spin.
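Because every block in an isotropic network has the same shape, cross-layer weight sharing can be sketched very directly: several consecutive layers alias one parameter tensor instead of owning their own. The sketch below (plain NumPy, with illustrative function names that are not from the paper's released code) shows how a sharing factor of 2 halves the distinct parameter count of a 12-layer stack, in the spirit of the ~1.9x ConvMixer compression the abstract reports.

```python
import numpy as np

def build_isotropic_stack(depth, dim, share_every=1, rng=None):
    """Build weight matrices for an isotropic (same-shape) layer stack.

    With share_every=k, one unique weight matrix is reused by k
    consecutive layers (cross-layer weight sharing);
    share_every=1 recovers the unshared baseline.
    """
    rng = rng or np.random.default_rng(0)
    n_unique = -(-depth // share_every)  # ceil(depth / share_every)
    unique = [rng.standard_normal((dim, dim)) / np.sqrt(dim)
              for _ in range(n_unique)]
    # Shared layers alias the same underlying array, not a copy.
    return [unique[i // share_every] for i in range(depth)]

def param_count(layers):
    # Count each distinct tensor once: aliased layers add no parameters.
    return sum(w.size for w in {id(w): w for w in layers}.values())

baseline = build_isotropic_stack(depth=12, dim=64)
shared = build_isotropic_stack(depth=12, dim=64, share_every=2)
print(param_count(baseline) / param_count(shared))  # prints 2.0
```

This only captures the parameter-count side of the design space; the paper's framework additionally varies which layers share, how shared weights are transformed per layer, and how the result trades FLOPs and parameters against accuracy.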

Authors (7)
  1. Chien-Yu Lin (14 papers)
  2. Anish Prabhu (6 papers)
  3. Thomas Merth (7 papers)
  4. Sachin Mehta (48 papers)
  5. Anurag Ranjan (27 papers)
  6. Maxwell Horton (18 papers)
  7. Mohammad Rastegari (57 papers)
Citations (2)
