TinyCLIP: CLIP Distillation via Affinity Mimicking and Weight Inheritance (2309.12314v1)

Published 21 Sep 2023 in cs.CV

Abstract: In this paper, we propose a novel cross-modal distillation method, called TinyCLIP, for large-scale language-image pre-trained models. The method introduces two core techniques: affinity mimicking and weight inheritance. Affinity mimicking explores the interaction between modalities during distillation, enabling student models to mimic teachers' behavior of learning cross-modal feature alignment in a visual-linguistic affinity space. Weight inheritance transmits the pre-trained weights from the teacher models to their student counterparts to improve distillation efficiency. Moreover, we extend the method into a multi-stage progressive distillation to mitigate the loss of informative weights during extreme compression. Comprehensive experiments demonstrate the efficacy of TinyCLIP, showing that it can reduce the size of the pre-trained CLIP ViT-B/32 by 50% while maintaining comparable zero-shot performance. While aiming for comparable performance, distillation with weight inheritance can speed up training by 1.4-7.8$\times$ compared to training from scratch. Moreover, our TinyCLIP ViT-8M/16, trained on YFCC-15M, achieves an impressive zero-shot top-1 accuracy of 41.1% on ImageNet, surpassing the original CLIP ViT-B/16 by 3.5% while utilizing only 8.9% of the parameters. Finally, we demonstrate the good transferability of TinyCLIP in various downstream tasks. Code and models will be open-sourced at https://aka.ms/tinyclip.
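The abstract names the two core techniques without detail. As a rough illustration, here is a minimal sketch of what affinity mimicking might look like: the student is trained to match the teacher's soft image-text affinity distributions over a batch, here via a symmetric KL divergence. The function name, the temperature `tau`, and the exact loss formulation are assumptions for illustration, not the paper's released code.

```python
# Hypothetical sketch of affinity mimicking; names and loss details are
# assumptions, not the authors' released implementation.
import torch
import torch.nn.functional as F

def affinity_mimicking_loss(img_s, txt_s, img_t, txt_t, tau=0.07):
    """KL between teacher and student image-text affinity distributions.

    img_*/txt_*: L2-normalized (batch, dim) embeddings from the student (s)
    and the teacher (t). tau is an illustrative temperature.
    """
    # Batch-level affinity (similarity) matrices for both models.
    logits_s = img_s @ txt_s.t() / tau
    logits_t = img_t @ txt_t.t() / tau

    # Student mimics the teacher's soft image-to-text alignment ...
    loss_i2t = F.kl_div(F.log_softmax(logits_s, dim=-1),
                        F.softmax(logits_t, dim=-1),
                        reduction="batchmean")
    # ... and the text-to-image direction (transposed affinities).
    loss_t2i = F.kl_div(F.log_softmax(logits_s.t(), dim=-1),
                        F.softmax(logits_t.t(), dim=-1),
                        reduction="batchmean")
    return 0.5 * (loss_i2t + loss_t2i)
```

Weight inheritance can likewise be pictured as initializing the student from a subset of the teacher's pre-trained weights. The paper selects informative weights (manually or with learnable masks); the uniform leading-slice below is a deliberate simplification that only shows the shape bookkeeping.

```python
def inherit_weights(teacher_state, keep_ratio=0.5):
    """Toy weight inheritance: keep the leading keep_ratio fraction of every
    dimension of each teacher tensor to initialize a smaller student."""
    student_state = {}
    for name, w in teacher_state.items():
        idx = tuple(slice(0, max(1, int(d * keep_ratio))) for d in w.shape)
        student_state[name] = w[idx].clone()
    return student_state
```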

Authors (12)
  1. Kan Wu (42 papers)
  2. Houwen Peng (36 papers)
  3. Zhenghong Zhou (6 papers)
  4. Bin Xiao (93 papers)
  5. Mengchen Liu (48 papers)
  6. Lu Yuan (130 papers)
  7. Hong Xuan (9 papers)
  8. Michael Valenzuela (1 paper)
  9. Xi Chen
  10. Xinggang Wang (163 papers)
  11. Hongyang Chao (34 papers)
  12. Han Hu (196 papers)
Citations (37)