Universal Semi-Supervised Domain Adaptation by Mitigating Common-Class Bias (2403.11234v1)

Published 17 Mar 2024 in cs.CV

Abstract: Domain adaptation is a critical task in machine learning that aims to improve model performance on a target domain by leveraging knowledge from a related source domain. In this work, we introduce Universal Semi-Supervised Domain Adaptation (UniSSDA), a practical yet challenging setting where the target domain is partially labeled and the source and target label spaces may not strictly match. UniSSDA lies at the intersection of Universal Domain Adaptation (UniDA) and Semi-Supervised Domain Adaptation (SSDA): the UniDA setting does not allow for fine-grained categorization of target-private classes not represented in the source domain, while SSDA focuses on the restricted closed-set setting where source and target label spaces match exactly. Existing UniDA and SSDA methods are susceptible to common-class bias in UniSSDA settings, where models overfit to the data distributions of classes common to both domains at the expense of private classes. We propose a new prior-guided pseudo-label refinement strategy to reduce the reinforcement of common-class bias due to pseudo-labeling, a common label propagation strategy in domain adaptation. We demonstrate the effectiveness of the proposed strategy on the benchmark datasets Office-Home, DomainNet, and VisDA. The proposed strategy attains the best performance across UniSSDA adaptation settings and establishes a new baseline for UniSSDA.
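Pseudo-labeling propagates a model's own predictions onto unlabeled target data, so any initial bias toward classes shared with the source domain compounds over training. The sketch below illustrates one way a class prior can guide pseudo-label refinement: predicted probabilities on unlabeled target samples are reweighted toward an estimated target label distribution before confidence thresholding. This is a minimal illustration under stated assumptions, not the paper's exact algorithm; the function `refine_pseudo_labels`, the choice of prior estimator, and the threshold value are all hypothetical.

```python
import torch
import torch.nn.functional as F

def refine_pseudo_labels(logits, class_prior, threshold=0.9):
    """Prior-guided pseudo-label refinement (illustrative sketch).

    Reweights softmax predictions on unlabeled target data by an
    estimated target class prior, so that classes shared between
    source and target (which the model tends to over-predict) do
    not crowd out target-private classes during pseudo-labeling.

    Args:
        logits:      (N, C) classifier outputs on unlabeled target data.
        class_prior: (C,) estimated target label distribution, e.g. the
                     empirical class frequencies of the labeled target
                     subset (an assumption; the paper's estimator may differ).
        threshold:   confidence cutoff for accepting a pseudo-label.

    Returns:
        pseudo_labels: (N,) predicted class indices.
        mask:          (N,) bool, True where refined confidence
                       exceeds the threshold.
    """
    probs = F.softmax(logits, dim=1)
    # Divide out the model's average prediction (its implied bias
    # toward over-represented common classes), then rescale toward
    # the target prior and renormalize to a valid distribution.
    refined = probs * class_prior / probs.mean(dim=0, keepdim=True)
    refined = refined / refined.sum(dim=1, keepdim=True)
    conf, pseudo_labels = refined.max(dim=1)
    return pseudo_labels, conf >= threshold

# Toy usage: 4 target samples, 3 classes; class 2 plays the role of a
# target-private class that the raw predictions underrepresent.
logits = torch.tensor([[2.0, 0.5, 0.3],
                       [1.8, 0.4, 1.6],
                       [0.2, 2.1, 0.1],
                       [0.3, 0.2, 1.9]])
prior = torch.tensor([0.3, 0.3, 0.4])  # assumed target label prior
labels, mask = refine_pseudo_labels(logits, prior, threshold=0.5)
print(labels, mask)
```

The reweighting step resembles distribution-alignment heuristics from the semi-supervised learning literature; whatever its exact form, the point is that the accepted pseudo-labels no longer simply mirror the model's common-class-skewed confidence.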

Authors (5)
  1. Wenyu Zhang
  2. Qingmu Liu
  3. Felix Ong Wei Cong
  4. Mohamed Ragab
  5. Chuan-Sheng Foo