DiDA: Disentangled Synthesis for Domain Adaptation (1805.08019v1)

Published 21 May 2018 in cs.CV

Abstract: Unsupervised domain adaptation aims at learning a shared model for two related, but not identical, domains by transferring supervision from a labeled source domain to an unlabeled target domain. A number of effective domain adaptation approaches rely on the ability to extract discriminative, yet domain-invariant, latent factors that are common to both domains. Extracting latent commonality is also useful for disentanglement analysis, enabling separation of the common and the domain-specific features of both domains. In this paper, we present a method for boosting domain adaptation performance by leveraging disentanglement analysis. The key idea is that by learning to separately extract both the common and the domain-specific features, one can synthesize additional labeled target-domain data, thereby boosting domain adaptation performance. Better common-feature extraction, in turn, further improves the disentanglement analysis and the disentangled synthesis. We show that iterating between domain adaptation and disentanglement analysis consistently improves both, on several unsupervised domain adaptation tasks and for various domain adaptation backbone models.
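
The abstract describes an alternating scheme: extract common (domain-invariant) and domain-specific features, reconstruct inputs from the two codes, then synthesize labeled target-styled data by pairing source common codes (whose labels are known) with target domain-specific codes, and feed that data back into the adaptation step. Below is a minimal PyTorch sketch of such a loop; it is not the authors' implementation, and every concrete choice here (the names `common_enc`/`specific_enc`, the 784-dimensional MNIST-like inputs, the equal loss weights) is an illustrative assumption.

```python
# Minimal sketch (assumed, not the authors' code) of a DiDA-style training step.
import torch
import torch.nn as nn
import torch.nn.functional as F

DIM = 256  # hypothetical latent width
IN = 784   # flattened 28x28 inputs, MNIST-like (assumption)

common_enc   = nn.Sequential(nn.Linear(IN, DIM), nn.ReLU())  # domain-invariant code
specific_enc = nn.Sequential(nn.Linear(IN, DIM), nn.ReLU())  # domain-specific code
decoder      = nn.Linear(2 * DIM, IN)                        # synthesis from (common, specific)
classifier   = nn.Linear(DIM, 10)                            # task head on the common code

opt = torch.optim.Adam(
    [*common_enc.parameters(), *specific_enc.parameters(),
     *decoder.parameters(), *classifier.parameters()], lr=1e-3)

def train_step(x_src, y_src, x_tgt):
    """One iteration: domain adaptation + disentanglement + disentangled synthesis."""
    # Domain adaptation: supervise the task head on source common features.
    c_src, c_tgt = common_enc(x_src), common_enc(x_tgt)
    cls_loss = F.cross_entropy(classifier(c_src), y_src)

    # Disentanglement: each image must be reconstructable from its own
    # common + domain-specific codes.
    s_src, s_tgt = specific_enc(x_src), specific_enc(x_tgt)
    rec_loss = (F.mse_loss(decoder(torch.cat([c_src, s_src], 1)), x_src) +
                F.mse_loss(decoder(torch.cat([c_tgt, s_tgt], 1)), x_tgt))

    # Disentangled synthesis: source common code (label known) + target
    # domain-specific code -> extra labeled, target-styled training data.
    x_syn = decoder(torch.cat([c_src, s_tgt], 1)).detach()
    syn_loss = F.cross_entropy(classifier(common_enc(x_syn)), y_src)

    loss = cls_loss + rec_loss + syn_loss  # equal weighting is an assumption
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Smoke test with random stand-ins for source/target batches.
x_src, y_src, x_tgt = torch.rand(32, IN), torch.randint(0, 10, (32,)), torch.rand(32, IN)
print(train_step(x_src, y_src, x_tgt))
```

Detaching the synthesized batch treats it as extra fixed training data for the task head rather than backpropagating the classification loss into the decoder; the paper's iteration between adaptation and disentanglement could equally be realized by alternating which loss terms are optimized in each phase.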

Authors (7)
  1. Jinming Cao (7 papers)
  2. Oren Katzir (6 papers)
  3. Peng Jiang (274 papers)
  4. Dani Lischinski (56 papers)
  5. Danny Cohen-Or (4 papers)
  6. Changhe Tu (42 papers)
  7. Yangyan Li (16 papers)
Citations (34)
