
Exploiting Data Hierarchy as a New Modality for Contrastive Learning (2401.03312v1)

Published 6 Jan 2024 in cs.CV and cs.AI

Abstract: This work investigates how hierarchically structured data can help neural networks learn conceptual representations of cathedrals. The underlying WikiScenes dataset provides a spatially organized hierarchical structure of cathedral components. We propose a novel hierarchical contrastive training approach that leverages a triplet margin loss to represent the data's spatial hierarchy in the encoder's latent space. As such, the proposed approach investigates whether the dataset structure provides valuable information for self-supervised learning. We apply t-SNE to visualize the resultant latent space and evaluate the proposed approach by comparing it with other dataset-specific contrastive learning methods using a common downstream classification task. The proposed method outperforms the comparable weakly-supervised and baseline methods. Our findings suggest that dataset structure is a valuable modality for weakly-supervised learning.
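
To make the training idea concrete, below is a minimal PyTorch sketch of contrastive training with a triplet margin loss over hierarchically sampled images. It is an illustration under stated assumptions, not the authors' released code: the ResNet-18 backbone, the margin and learning-rate values, and the training_step helper are all hypothetical; only the use of a triplet margin loss driven by the dataset's spatial hierarchy comes from the abstract.

```python
# Sketch: hierarchical contrastive training with a triplet margin loss.
# Assumptions (not from the paper): ResNet-18 backbone, margin=1.0,
# Adam with lr=1e-4, and the batch-level helper below.
import torch
import torch.nn as nn
from torchvision import models

# Encoder: a standard ResNet whose classification head is replaced by
# an identity, so each image maps to a 512-d embedding vector.
encoder = models.resnet18(weights=None)
encoder.fc = nn.Identity()

# Triplet margin loss: pulls the anchor toward the positive and pushes
# it at least `margin` away from the negative in Euclidean distance.
triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)

def training_step(anchor, positive, negative):
    """One optimization step on a batch of image triplets.

    `positive` shares a node of the spatial hierarchy with `anchor`
    (e.g. images of the same cathedral component); `negative` is drawn
    from a different branch of the hierarchy.
    """
    z_a, z_p, z_n = encoder(anchor), encoder(positive), encoder(negative)
    loss = triplet_loss(z_a, z_p, z_n)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because positives and negatives are chosen from the hierarchy rather than from image augmentations alone, minimizing this loss encourages the encoder's latent space to mirror the WikiScenes spatial structure, which is the property the t-SNE visualizations are meant to expose.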

