SALT: Subspace Alignment as an Auxiliary Learning Task for Domain Adaptation

Published 11 Jun 2019 in stat.ML, cs.CV, and cs.LG (arXiv:1906.04338v2)

Abstract: Unsupervised domain adaptation aims to transfer and adapt knowledge learned from a labeled source domain to an unlabeled target domain. Key components of unsupervised domain adaptation include: (a) maximizing performance on the target, and (b) aligning the source and target domains. Traditionally, these tasks have either been treated as separate, or assumed to be addressed together implicitly by high-capacity feature extractors. When considered separately, alignment is usually viewed as a problem of matching data distributions, either through geometric approaches such as subspace alignment or through distributional alignment such as optimal transport. This paper presents a hybrid approach, in which we assume simplified data geometry in the form of subspaces and treat alignment as an auxiliary task to the primary task of maximizing performance on the source. The alignment itself is kept simple by exploiting this tractable subspace geometry. We synergistically allow certain parameters derived from the closed-form auxiliary solution to be updated by gradients from the primary task. The proposed approach thus fuses geometric, model-based alignment with gradients from a data-driven primary task. Our approach, termed SALT, is a simple framework that matches and sometimes outperforms the state of the art on multiple standard benchmarks.
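As background for the "closed-form auxiliary solution" the abstract refers to, the classical subspace-alignment primitive that SALT builds on can be written in a few lines. The sketch below is illustrative only: the function name, the use of PCA via SVD, and the subspace dimensionality d are assumptions, and it shows the standard closed-form alignment M = Ps^T Pt rather than SALT's full auxiliary-task training loop.

```python
import numpy as np

def subspace_alignment(Xs, Xt, d=32):
    """Classical subspace alignment (Fernando et al., ICCV 2013) sketch.

    Xs, Xt: (n_samples, n_features) source / target feature matrices.
    d: subspace dimensionality (a hypothetical choice, tuned in practice).
    Returns source and target features projected into aligned subspaces.
    """
    # Center each domain.
    Xs_c = Xs - Xs.mean(axis=0)
    Xt_c = Xt - Xt.mean(axis=0)
    # Top-d principal directions of each domain (columns are basis vectors).
    Ps = np.linalg.svd(Xs_c, full_matrices=False)[2][:d].T  # (n_features, d)
    Pt = np.linalg.svd(Xt_c, full_matrices=False)[2][:d].T
    # Closed-form alignment matrix: M = Ps^T Pt.
    M = Ps.T @ Pt
    # Project source through the aligned basis (Ps M), target through its own basis (Pt).
    Zs = Xs_c @ Ps @ M
    Zt = Xt_c @ Pt
    return Zs, Zt

# Example with random features standing in for deep-network embeddings.
Xs = np.random.randn(200, 512)
Xt = np.random.randn(150, 512)
Zs, Zt = subspace_alignment(Xs, Xt, d=32)
```

In SALT, parameters derived from this kind of closed-form alignment are additionally influenced by gradients from the source classification task; that coupling is not shown in the sketch above.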

Citations (2)
