NC-TTT: A Noise Contrastive Approach for Test-Time Training (2404.08392v1)

Published 12 Apr 2024 in cs.CV and cs.LG

Abstract: Despite their exceptional performance in vision tasks, deep learning models often struggle when faced with domain shifts during testing. Test-Time Training (TTT) methods have recently gained popularity by their ability to enhance the robustness of models through the addition of an auxiliary objective that is jointly optimized with the main task. Being strictly unsupervised, this auxiliary objective is used at test time to adapt the model without any access to labels. In this work, we propose Noise-Contrastive Test-Time Training (NC-TTT), a novel unsupervised TTT technique based on the discrimination of noisy feature maps. By learning to classify noisy views of projected feature maps, and then adapting the model accordingly on new domains, classification performance can be recovered by an important margin. Experiments on several popular test-time adaptation baselines demonstrate the advantages of our method compared to recent approaches for this task. The code can be found at: https://github.com/GustavoVargasHakim/NCTTT.git
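The abstract describes an auxiliary objective that learns to discriminate projected feature maps from noisy views of them, and then minimizes that same unsupervised objective at test time to adapt the model to a shifted domain. The sketch below illustrates how such a noise-contrastive auxiliary head and adaptation loop could look in PyTorch; the module names, noise level, and architecture here are assumptions for illustration only, not the authors' implementation (see the linked repository for the official code).

```python
# Hypothetical sketch of a noise-contrastive auxiliary objective for
# test-time training. Not the official NC-TTT code; names and
# hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoiseContrastiveHead(nn.Module):
    """Projects feature maps and scores whether a view is clean or noisy."""
    def __init__(self, in_channels: int, proj_dim: int = 64, noise_std: float = 0.5):
        super().__init__()
        self.projector = nn.Conv2d(in_channels, proj_dim, kernel_size=1)
        self.scorer = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(proj_dim, 1)
        )
        self.noise_std = noise_std

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        z = self.projector(feats)                             # projected feature map
        z_noisy = z + self.noise_std * torch.randn_like(z)    # noisy view of the projection
        logits = torch.cat([self.scorer(z), self.scorer(z_noisy)], dim=0).squeeze(1)
        labels = torch.cat([torch.ones(len(z)), torch.zeros(len(z))]).to(logits.device)
        # Binary noise-contrastive loss: discriminate clean from noisy views.
        return F.binary_cross_entropy_with_logits(logits, labels)

def adapt(encoder: nn.Module, head: NoiseContrastiveHead,
          test_batch: torch.Tensor, steps: int = 10, lr: float = 1e-4) -> None:
    """Unsupervised test-time adaptation: update the encoder so the auxiliary
    head can still tell clean from noisy views on the shifted test domain."""
    optimizer = torch.optim.SGD(encoder.parameters(), lr=lr)
    for _ in range(steps):
        loss = head(encoder(test_batch))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In this kind of setup the head is trained jointly with the main classification task on the source domain; at test time only the unsupervised noise-contrastive loss is optimized, so no labels from the target domain are ever needed.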
