
Source-Free Domain Adaptation with Diffusion-Guided Source Data Generation (2402.04929v3)

Published 7 Feb 2024 in cs.CV, cs.AI, and cs.LG

Abstract: This paper introduces a novel approach that leverages the generalizability of diffusion models for Source-Free Domain Adaptation (DM-SFDA). Our proposed DM-SFDA method fine-tunes a pre-trained text-to-image diffusion model to generate source-domain images, using features from the target images to guide the diffusion process. Specifically, the pre-trained diffusion model is fine-tuned to generate source samples that minimize entropy and maximize confidence for the pre-trained source model. We then use a diffusion model-based image mixup strategy to bridge the domain gap between the source and target domains. We validate our approach through comprehensive experiments across a range of datasets, including Office-31, Office-Home, and VisDA. The results demonstrate significant improvements in SFDA performance, highlighting the potential of diffusion models in generating contextually relevant, domain-specific images.
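
As a rough illustration of the objective described in the abstract, the sketch below shows how generated images could be scored by a frozen source classifier so that prediction entropy is minimized and confidence maximized, along with a plain image-space mixup as a stand-in for the paper's diffusion-based mixup. All names here (`diffusion_model`, `source_model`, `finetune_step`, `prompts`) are hypothetical placeholders, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def entropy_confidence_loss(logits: torch.Tensor) -> torch.Tensor:
    # Per-sample prediction entropy (to be minimized) combined with
    # top-class confidence (to be maximized) into a single scalar loss.
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1)
    confidence = probs.max(dim=-1).values
    return (entropy - confidence).mean()

def finetune_step(diffusion_model, source_model, prompts, optimizer):
    # Hypothetical fine-tuning step: the diffusion model generates candidate
    # source-domain images; the frozen source classifier only scores them.
    # Assumes the generation path is differentiable so gradients can reach
    # the diffusion model's weights.
    images = diffusion_model(prompts)
    logits = source_model(images)
    loss = entropy_confidence_loss(logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def image_mixup(source_img, target_img, lam=0.5):
    # Plain convex combination in pixel space; the paper's mixup operates
    # through the diffusion model, whose exact form the abstract doesn't give.
    return lam * source_img + (1.0 - lam) * target_img
```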

Authors (4)
  1. Shivang Chopra (10 papers)
  2. Suraj Kothawade (32 papers)
  3. Houda Aynaou (3 papers)
  4. Aman Chadha (110 papers)

