
Localization Distillation for Dense Object Detection (2102.12252v4)

Published 24 Feb 2021 in cs.CV

Abstract: Knowledge distillation (KD) has demonstrated a powerful capability for learning compact models in object detection. Previous KD methods for object detection mostly focus on imitating deep features within the imitation regions rather than mimicking classification logits, because logit mimicking was considered inefficient at distilling localization information and yielded only trivial improvement. In this paper, by reformulating the knowledge distillation process on localization, we present a novel localization distillation (LD) method that can efficiently transfer localization knowledge from the teacher to the student. Moreover, we heuristically introduce the concept of the valuable localization region, which helps selectively distill the semantic and localization knowledge for a given region. Combining these two new components, we show for the first time that logit mimicking can outperform feature imitation, and that localization knowledge is more important and efficient to distill than semantic knowledge for object detectors. Our distillation scheme is simple and effective and can be easily applied to different dense object detectors. Experiments show that our LD boosts the AP score of GFocal-ResNet-50 with a single-scale 1x training schedule from 40.1 to 42.1 on the COCO benchmark without any sacrifice in inference speed. Our source code and trained models are publicly available at https://github.com/HikariTJU/LD
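The core idea of LD can be sketched as a temperature-scaled KL divergence between teacher and student box-edge distributions, assuming a GFocal-style head that predicts each of the four box edges as a discrete distribution over n bins. The function name, shapes, and temperature value below are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the bin axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def ld_loss(student_logits, teacher_logits, temperature=10.0):
    """Hypothetical localization distillation loss.

    student_logits, teacher_logits: arrays of shape (num_boxes, 4, n_bins),
    one raw logit vector per box edge (left, top, right, bottom).
    Returns KL(teacher || student) averaged over boxes and edges,
    scaled by T^2 as in standard knowledge distillation.
    """
    t = temperature
    p = softmax(teacher_logits / t)                   # soft teacher targets
    log_p = np.log(p + 1e-12)
    log_q = np.log(softmax(student_logits / t) + 1e-12)
    kl = (p * (log_p - log_q)).sum(axis=-1)           # per (box, edge) KL
    return kl.mean() * t * t
```

In the paper's scheme, this loss would be applied only inside the valuable localization regions selected from the teacher's predictions, alongside the usual detection losses.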

Authors (7)
  1. Zhaohui Zheng (12 papers)
  2. Rongguang Ye (16 papers)
  3. Ping Wang (289 papers)
  4. Dongwei Ren (31 papers)
  5. Wangmeng Zuo (279 papers)
  6. Qibin Hou (82 papers)
  7. Ming-Ming Cheng (185 papers)
Citations (95)
