
Improving Unsupervised Domain Adaptation with Variational Information Bottleneck (1911.09310v1)

Published 21 Nov 2019 in cs.LG and stat.ML

Abstract: Domain adaptation aims to leverage the supervision signal of the source domain to obtain an accurate model for the target domain, where labels are not available. To leverage and adapt the label information from the source domain, most existing methods employ a feature extracting function and match the marginal distributions of the source and target domains in a shared feature space. In this paper, from the perspective of information theory, we show that representation matching is actually an insufficient constraint on the feature space for obtaining a model with good generalization performance in the target domain. We then propose variational bottleneck domain adaptation (VBDA), a new domain adaptation method which improves feature transferability by explicitly enforcing the feature extractor to ignore task-irrelevant factors and focus on the information that is essential to the task of interest for both source and target domains. Extensive experimental results demonstrate that VBDA significantly outperforms state-of-the-art methods across three domain adaptation benchmark datasets.
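The abstract's core idea (constraining the feature extractor to discard task-irrelevant information) is typically realized with a variational information bottleneck: the encoder outputs a Gaussian over latent features, and a KL penalty toward a standard normal prior upper-bounds the information the features retain about the input. The paper's exact objective is not reproduced here, so the sketch below is a generic, hedged illustration of that regularizer using NumPy; the function names, the `beta` weight, and the toy shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_kl_to_standard_normal(mu, log_var):
    """Per-sample KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dims.

    This closed form follows from the standard KL divergence between
    diagonal Gaussians; it serves as a variational upper bound on the
    mutual information between inputs and latent features.
    """
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=1)

def vib_objective(task_loss, mu, log_var, beta=1e-3):
    """Task loss plus a beta-weighted information-bottleneck penalty.

    `beta` trades off task accuracy against feature compression; its
    value here is a placeholder, not a figure from the paper.
    """
    kl = gaussian_kl_to_standard_normal(mu, log_var)
    return task_loss + beta * float(np.mean(kl))

# Toy usage: a batch of 4 samples with 8-dimensional latent features.
mu = np.zeros((4, 8))
log_var = np.zeros((4, 8))
# With mu = 0 and log_var = 0, the encoder matches the prior exactly,
# so the KL penalty vanishes and the objective equals the task loss.
total = vib_objective(task_loss=0.7, mu=mu, log_var=log_var)
```

In a full adaptation pipeline this penalty would be applied to features from both domains alongside a distribution-matching term, consistent with the abstract's claim that matching alone is an insufficient constraint.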

Authors (8)
  1. Yuxuan Song (31 papers)
  2. Lantao Yu (32 papers)
  3. Zhangjie Cao (34 papers)
  4. Zhiming Zhou (24 papers)
  5. Jian Shen (68 papers)
  6. Shuo Shao (35 papers)
  7. Weinan Zhang (322 papers)
  8. Yong Yu (219 papers)
Citations (16)
