
Unbiased Teacher for Semi-Supervised Object Detection (2102.09480v1)

Published 18 Feb 2021 in cs.CV and cs.LG

Abstract: Semi-supervised learning, i.e., training networks with both labeled and unlabeled data, has made significant progress recently. However, existing works have primarily focused on image classification tasks and neglected object detection which requires more annotation effort. In this work, we revisit the Semi-Supervised Object Detection (SS-OD) and identify the pseudo-labeling bias issue in SS-OD. To address this, we introduce Unbiased Teacher, a simple yet effective approach that jointly trains a student and a gradually progressing teacher in a mutually-beneficial manner. Together with a class-balance loss to downweight overly confident pseudo-labels, Unbiased Teacher consistently improved state-of-the-art methods by significant margins on COCO-standard, COCO-additional, and VOC datasets. Specifically, Unbiased Teacher achieves 6.8 absolute mAP improvements against state-of-the-art method when using 1% of labeled data on MS-COCO, achieves around 10 mAP improvements against the supervised baseline when using only 0.5, 1, 2% of labeled data on MS-COCO.

Unbiased Teacher for Semi-Supervised Object Detection

The paper introduces "Unbiased Teacher," a method devised to tackle the specific challenges of Semi-Supervised Object Detection (SS-OD). Unlike image classification, object detection requires intensive manual annotation, particularly bounding-box labels, which makes the semi-supervised setting highly beneficial yet challenging.

Overview of Methodology

The authors introduce Unbiased Teacher to mitigate the pseudo-labeling biases inherent in SS-OD frameworks. The method pairs two models: a Student trained by gradient descent and a progressively updated Teacher. The Teacher generates pseudo-labels to supervise the Student on unlabeled data, while knowledge flows back from Student to Teacher through an exponential moving average (EMA) of the Student's weights. This interplay addresses class imbalance and the pseudo-label bias arising from overly confident predictions.
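
As a minimal sketch of the pseudo-labeling step, the Teacher's detections on unlabeled images can be filtered by a confidence threshold before they supervise the Student. The tuple layout and the threshold value below are illustrative assumptions, not the paper's exact implementation details:

```python
def filter_pseudo_labels(detections, tau=0.7):
    """Keep only Teacher detections whose confidence score exceeds tau.

    `detections` is a list of (box, class_id, score) tuples; the
    representation and tau=0.7 are hypothetical choices for illustration.
    """
    return [d for d in detections if d[2] >= tau]

# Hypothetical Teacher outputs on one unlabeled image: (box, class_id, score)
dets = [((0, 0, 10, 10), 1, 0.92), ((5, 5, 8, 8), 2, 0.40)]
pseudo_labels = filter_pseudo_labels(dets)
print(len(pseudo_labels))  # only the confident detection survives
```

The surviving detections are then treated as ground truth for the Student's unsupervised loss term.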

Key components of the method include:

  • Pseudo-labeling with Class-Balance Loss: This reduces the effect of overly confident pseudo-labels, which are often skewed toward dominant classes and the background.
  • Teacher-Student Learning Framework: The framework employs the Teacher model to provide pseudo-label supervision while the Student model iteratively updates the Teacher model.
  • Exponential Moving Average (EMA): This mechanism stabilizes the Teacher model by incorporating the knowledge accumulated by successive Student models.

Empirical Results

The method is empirically validated on multiple datasets, including COCO-standard, COCO-additional, and PASCAL VOC. The authors report substantial performance improvements over state-of-the-art methods when using minimal labeled-data fractions on MS-COCO. In particular, Unbiased Teacher achieves around 10 mAP improvement over the supervised baseline when using only 0.5%, 1%, or 2% of the labeled data on MS-COCO, and a 6.8 absolute mAP gain over the prior state of the art at 1% labeled data.

Implications and Future Directions

From a practical standpoint, Unbiased Teacher presents a promising advancement in reducing the dependency on large-scale labeled datasets for training robust object detectors. This is crucial for scenarios where acquiring labeled data is costly or impractical.

Theoretically, this work contributes to the expanding discourse on semi-supervised learning, particularly emphasizing the significance of addressing pseudo-labeling biases. By harnessing both labeled and unlabeled data, the research showcases the potential to navigate and exploit the inherent information within unlabeled data effectively.

Looking to future directions, further research could refine the pseudo-label generation process through adaptive mechanisms or leverage additional forms of weak supervision. Exploring more diverse data augmentation techniques within this framework could also yield incremental benefits.

Furthermore, expanding this approach to integrate other forms of unsupervised learning, such as contrastive learning strategies, might enhance the robustness and generalization capacities of the Semi-Supervised Object Detection frameworks.

Conclusion

Unbiased Teacher offers a compelling approach to address the challenges in SS-OD, marked by its novel combination of Teacher-Student dynamics and class-balance strategies. The significant numerical improvements on object detection tasks underscore its efficacy and highlight the potential to reshape practices involving deep neural network training with limited labeled data.

Authors: Yen-Cheng Liu, Chih-Yao Ma, Zijian He, Chia-Wen Kuo, Kan Chen, Peizhao Zhang, Bichen Wu, Zsolt Kira, Peter Vajda
Citations (436)