
Data-Efficient CLIP-Powered Dual-Branch Networks for Source-Free Unsupervised Domain Adaptation (2410.15811v2)

Published 21 Oct 2024 in cs.CV

Abstract: Source-free Unsupervised Domain Adaptation (SF-UDA) aims to transfer a model's performance from a labeled source domain to an unlabeled target domain without direct access to source samples, addressing critical data privacy concerns. However, most existing SF-UDA approaches assume the availability of abundant source domain samples, which is often impractical due to the high cost of data annotation. To address the dual challenges of limited source data and privacy concerns, we introduce a data-efficient, CLIP-powered dual-branch network (CDBN). This architecture consists of a cross-domain feature transfer branch and a target-specific feature learning branch, leveraging high-confidence target domain samples to transfer text features of source domain categories while learning target-specific soft prompts. By fusing the outputs of both branches, our approach not only effectively transfers source domain category semantic information to the target domain but also reduces the negative impacts of noise and domain gaps during target training. Furthermore, we propose an unsupervised optimization strategy driven by accurate classification and diversity, preserving the classification capability learned from the source domain while generating more confident and diverse predictions in the target domain. CDBN achieves near state-of-the-art performance with far fewer source domain samples than existing methods across 31 transfer tasks on seven datasets.
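To make the dual-branch design concrete, below is a minimal PyTorch sketch of the architecture as it can be read from the abstract alone. Everything here is an assumption for illustration: the class name `CDBNSketch`, the prompt projection layer, the equal-weight fusion of the two branches, and the information-maximization-style loss are stand-ins, not the authors' released implementation, whose exact fusion rule and objectives may differ.

```python
# Illustrative sketch of a CLIP-powered dual-branch network for SF-UDA,
# reconstructed from the abstract. Names and design choices are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class CDBNSketch(nn.Module):
    def __init__(self, image_encoder: nn.Module, text_features: torch.Tensor,
                 embed_dim: int = 512, prompt_len: int = 16, num_classes: int = 31):
        super().__init__()
        self.image_encoder = image_encoder  # stand-in for a frozen CLIP image encoder
        # Branch 1 (cross-domain feature transfer): fixed text features of the
        # source-domain category names, matched against target images.
        self.register_buffer("source_text_feats", text_features)  # (C, D)
        # Branch 2 (target-specific feature learning): learnable soft prompts,
        # one per class, projected to the shared embedding space.
        self.soft_prompts = nn.Parameter(torch.randn(num_classes, prompt_len, embed_dim) * 0.02)
        self.prompt_proj = nn.Linear(prompt_len * embed_dim, embed_dim)
        self.logit_scale = nn.Parameter(torch.tensor(4.6))  # ~ln(100), CLIP-style temperature

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        img = F.normalize(self.image_encoder(images), dim=-1)   # (B, D)
        # Branch 1: zero-shot-style logits against source text features.
        txt = F.normalize(self.source_text_feats, dim=-1)       # (C, D)
        logits_src = self.logit_scale.exp() * img @ txt.t()
        # Branch 2: logits against prompt-derived target class embeddings.
        tgt = F.normalize(self.prompt_proj(self.soft_prompts.flatten(1)), dim=-1)
        logits_tgt = self.logit_scale.exp() * img @ tgt.t()
        # Fuse the branches; a simple average is assumed here.
        return 0.5 * (logits_src + logits_tgt)


def accuracy_diversity_loss(logits: torch.Tensor) -> torch.Tensor:
    """One common way to encode 'confident and diverse' predictions
    (information maximization); the paper's exact losses may differ."""
    p = logits.softmax(dim=-1)
    # Minimize per-sample entropy -> confident predictions.
    ent = -(p * p.clamp_min(1e-8).log()).sum(dim=-1).mean()
    # Maximize entropy of the batch-marginal distribution -> diverse predictions.
    mean_p = p.mean(dim=0)
    div = (mean_p * mean_p.clamp_min(1e-8).log()).sum()
    return ent + div
```

A quick usage check with a dummy encoder in place of CLIP:

```python
enc = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 512))  # dummy image encoder
model = CDBNSketch(enc, torch.randn(31, 512))
logits = model(torch.randn(4, 3, 224, 224))
loss = accuracy_diversity_loss(logits)
```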
