
Unsupervised Embedding Learning from Uncertainty Momentum Modeling

Published 19 Jul 2021 in cs.CV (arXiv:2107.08892v1)

Abstract: Existing popular unsupervised embedding learning methods focus on enhancing the instance-level local discrimination of the given unlabeled images by exploring various negative data. However, existing sample outliers, which exhibit large intra-class divergence or small inter-class variation, severely limit their learning performance. We show that this performance limitation is caused by gradient vanishing on these sample outliers. Moreover, the shortage of positive data and the disregard for global discrimination also pose critical issues for unsupervised learning, yet they are routinely ignored by existing methods. To handle these issues, we propose a novel solution that explicitly models and directly exploits the uncertainty of the given unlabeled learning samples. Instead of learning a deterministic feature point for each sample in the embedding space, we represent a sample by a stochastic Gaussian, whose mean vector depicts its location in the embedding space and whose covariance vector represents the sample uncertainty. We leverage this uncertainty modeling as momentum during learning, which helps tackle the outliers. Furthermore, abundant positive candidates can be readily drawn from the learned instance-specific distributions and are further adopted to mitigate the aforementioned issues. Thorough rationale analyses and extensive experiments are presented to verify the superiority of our approach.
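To make the core idea concrete, below is a minimal sketch (assuming a PyTorch setup, which the abstract does not specify) of representing each sample as a Gaussian in embedding space, with a mean vector for localization and a diagonal covariance for uncertainty, and drawing positive candidates from the learned instance-specific distribution via the reparameterization trick. The module and function names (GaussianEmbeddingHead, sample_positives) and the backbone dimensions are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianEmbeddingHead(nn.Module):
    """Maps backbone features to a Gaussian embedding: mean + per-dimension log-variance."""
    def __init__(self, feat_dim: int = 2048, embed_dim: int = 128):
        super().__init__()
        self.mean_head = nn.Linear(feat_dim, embed_dim)    # mu: location in embedding space
        self.logvar_head = nn.Linear(feat_dim, embed_dim)  # log sigma^2: sample uncertainty

    def forward(self, backbone_feat: torch.Tensor):
        mu = F.normalize(self.mean_head(backbone_feat), dim=-1)
        logvar = self.logvar_head(backbone_feat)
        return mu, logvar

def sample_positives(mu: torch.Tensor, logvar: torch.Tensor, num_samples: int = 4):
    """Draw positive candidates from N(mu, diag(sigma^2)) via the reparameterization trick."""
    std = torch.exp(0.5 * logvar)              # (B, D)
    eps = torch.randn(num_samples, *mu.shape)  # (K, B, D)
    samples = mu.unsqueeze(0) + eps * std.unsqueeze(0)
    return F.normalize(samples, dim=-1)        # keep samples on the unit hypersphere

# Usage: backbone features -> Gaussian parameters -> sampled positives for a contrastive loss.
if __name__ == "__main__":
    head = GaussianEmbeddingHead(feat_dim=2048, embed_dim=128)
    feats = torch.randn(8, 2048)               # stand-in for CNN backbone features
    mu, logvar = head(feats)
    positives = sample_positives(mu, logvar, num_samples=4)
    print(positives.shape)                     # torch.Size([4, 8, 128])
```

Sampling several positives per instance in this way is one plausible reading of how the "abundant positive candidates" mentioned in the abstract could be generated; the paper's actual loss and uncertainty-momentum mechanism may differ.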

Citations (1)
