
Semi-Supervised Hypothesis Transfer for Source-Free Domain Adaptation (2107.06735v1)

Published 14 Jul 2021 in cs.CV

Abstract: Domain adaptation has been widely used to deal with distribution shift in vision, language, multimedia, etc. Most domain adaptation methods learn domain-invariant features with data from both domains available. However, such a strategy may be infeasible in practice when source data are unavailable due to data-privacy concerns. To address this issue, we propose a novel adaptation method via hypothesis transfer that does not access source data at the adaptation stage. To make full use of the limited target data, a semi-supervised mutual enhancement method is proposed, in which entropy minimization and augmented label propagation are applied iteratively to perform inter-domain and intra-domain alignment. Compared with state-of-the-art methods, experimental results on three public datasets demonstrate that our method achieves up to 19.9% improvement on semi-supervised adaptation tasks.
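The entropy minimization objective mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic NumPy version of the prediction-entropy loss that source-free methods minimize over unlabeled target data to sharpen the transferred source hypothesis's predictions (the `softmax` and `entropy_loss` helpers below are illustrative names, not from the paper):

```python
import numpy as np

def softmax(logits, axis=-1):
    """Numerically stable softmax over class logits."""
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def entropy_loss(logits):
    """Mean Shannon entropy of per-sample class predictions.
    Minimizing this on unlabeled target data encourages the
    classifier to make confident (low-entropy) predictions,
    implicitly moving target features toward source class centers."""
    p = softmax(logits)
    return float(-(p * np.log(p + 1e-12)).sum(axis=-1).mean())

# Confident logits give near-zero entropy; uniform logits give log(C).
confident = np.array([[10.0, 0.0, 0.0]])
uniform = np.array([[1.0, 1.0, 1.0]])
```

In a full pipeline this loss would be backpropagated through the target network's feature extractor while the source-trained classifier head stays fixed; the label-propagation step the abstract describes would then spread labels from confident samples to their neighbors within the target domain.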

Authors (7)
  1. Ning Ma (39 papers)
  2. Jiajun Bu (52 papers)
  3. Lixian Lu (1 paper)
  4. Jun Wen (31 papers)
  5. Zhen Zhang (384 papers)
  6. Sheng Zhou (186 papers)
  7. Xifeng Yan (52 papers)
Citations (5)
