LEAD: Learning Decomposition for Source-free Universal Domain Adaptation (2403.03421v1)
Abstract: Universal Domain Adaptation (UniDA) targets knowledge transfer in the presence of both covariate and label shifts. Recently, Source-free Universal Domain Adaptation (SF-UniDA) has emerged to achieve UniDA without access to source data, which is more practical under data protection policies. The main challenge lies in determining whether covariate-shifted samples belong to target-private unknown categories. Existing methods tackle this either through hand-crafted thresholding or through time-consuming iterative clustering strategies. In this paper, we propose a new idea, LEArning Decomposition (LEAD), which decouples features into source-known and -unknown components to identify target-private data. Technically, LEAD first leverages orthogonal decomposition analysis for feature decomposition, and then builds instance-level decision boundaries to adaptively identify target-private data. Extensive experiments across various UniDA scenarios demonstrate the effectiveness and superiority of LEAD. Notably, in the OPDA scenario on the VisDA dataset, LEAD outperforms GLC by 3.5% overall H-score and reduces the time needed to derive pseudo-labeling decision boundaries by 75%. Moreover, LEAD is appealing in that it is complementary to most existing methods. The code is available at https://github.com/ispc-lab/LEAD.
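The core idea of decomposing a feature into a source-known component and a source-unknown residual can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class prototypes, feature dimensions, and the per-instance residual score below are all hypothetical assumptions introduced for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: C source classes with d-dimensional prototype vectors.
C, d = 5, 16
prototypes = rng.normal(size=(C, d))

# Orthonormal basis of the source-known subspace, obtained from the
# right singular vectors of the prototype matrix.
U = np.linalg.svd(prototypes, full_matrices=False)[2].T  # shape (d, C)

def decompose(feat):
    """Split a feature into source-known and source-unknown components."""
    known = U @ (U.T @ feat)   # projection onto the source-known subspace
    unknown = feat - known     # orthogonal residual outside that subspace
    return known, unknown

# A feature lying inside the source subspace leaves a near-zero residual ...
_, u_in = decompose(prototypes[0])

# ... while an arbitrary feature keeps a sizable unknown component. A simple
# instance-level score is the fraction of energy outside the subspace.
feat = rng.normal(size=d)
_, u_out = decompose(feat)
score = np.linalg.norm(u_out) / np.linalg.norm(feat)
print(np.linalg.norm(u_in), score)
```

A larger score indicates the instance is less explainable by source-known directions, which is the intuition behind treating such samples as candidates for target-private unknown categories.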
Authors: Sanqing Qu, Tianpei Zou, Lianghua He, Florian Röhrbein, Alois Knoll, Guang Chen, Changjun Jiang