
Discrepancy-Based Active Learning for Domain Adaptation (2103.03757v3)

Published 5 Mar 2021 in cs.LG

Abstract: The goal of the paper is to design active learning strategies that lead to domain adaptation under an assumption of Lipschitz functions. Building on previous work by Mansour et al. (2009), we adapt the concept of discrepancy distance between source and target distributions, restricting the maximization over the hypothesis class to a localized class of functions that label the source domain accurately. We derive generalization error bounds for such active learning strategies in terms of the Rademacher average and the localized discrepancy, for general loss functions satisfying a regularity condition. A practical K-medoids algorithm that can handle large data sets is derived from the theoretical bounds. Our numerical experiments show that the proposed algorithm is competitive against other state-of-the-art active learning techniques in the context of domain adaptation, in particular on large data sets of around one hundred thousand images.
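The abstract mentions a practical K-medoids algorithm for selecting representative points from large data sets; the paper's exact discrepancy-based selection rule is not given here, so the following is only a generic sketch of the k-medoids selection step (the function name, the alternating-assignment heuristic, and all parameters are illustrative assumptions, not the authors' method):

```python
import numpy as np

def k_medoids_select(X, k, n_iter=20, seed=0):
    """Select k representative points (medoids) from X with a simple
    alternating k-medoids heuristic. Illustrative sketch only; the
    paper's actual algorithm is guided by localized discrepancy bounds."""
    rng = np.random.default_rng(seed)
    n = len(X)
    medoids = rng.choice(n, size=k, replace=False)
    # Precompute pairwise squared Euclidean distances
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    for _ in range(n_iter):
        # Assign each point to its nearest current medoid
        labels = D[:, medoids].argmin(axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            cluster = np.where(labels == j)[0]
            if len(cluster) == 0:
                continue
            # The new medoid minimizes total distance within its cluster
            within = D[np.ix_(cluster, cluster)].sum(axis=1)
            new_medoids[j] = cluster[within.argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids
```

In an active learning loop, the returned medoid indices would be the points whose labels are queried next; precomputing the full distance matrix is quadratic in the sample size, so a realistic large-scale variant would subsample or use mini-batches.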

Authors (4)
  1. Antoine de Mathelin (10 papers)
  2. Mathilde Mougeot (26 papers)
  3. Nicolas Vayatis (48 papers)
  4. Francois Deheeger (4 papers)
Citations (21)
