A Probabilistic approach for Learning Embeddings without Supervision (1912.08275v1)

Published 17 Dec 2019 in cs.CV and cs.LG

Abstract: For challenging machine learning problems such as zero-shot learning and fine-grained categorization, embedding learning is the machinery of choice because of its ability to learn generic notions of similarity, as opposed to the class-specific concepts in standard classification models. Embedding learning aims to learn discriminative representations of data such that similar examples are pulled closer while dissimilar ones are pushed away. Despite their exemplary performance, supervised embedding learning approaches require a huge number of annotations for training. This restricts their applicability to large datasets in new applications where obtaining labels requires extensive manual effort and domain knowledge. In this paper, we propose to learn an embedding in a completely unsupervised manner, without using any class labels. Using a graph-based clustering approach to obtain pseudo-labels, we form triplet-based constraints following a metric learning paradigm. Our novel embedding learning approach uses a probabilistic notion that intuitively minimizes the chances of each triplet violating a geometric constraint. Due to the nature of the search space, we learn the parameters of our approach using Riemannian geometry. Our proposed approach performs competitively with state-of-the-art approaches.
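The abstract outlines a three-step pipeline: graph-based clustering yields pseudo-labels, triplets are mined from those pseudo-labels, and a probabilistic triplet objective is minimized over a constrained (Riemannian) parameter space. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' implementation: the choice of spectral clustering as the graph-based clusterer, the linear embedding, the sigmoid form of the violation probability, and all parameter values are assumptions made for illustration.

```python
# Hypothetical sketch (not the paper's code): unsupervised triplet mining with a
# probabilistic triplet objective. Clustering method, loss form, and embedding
# parameterization are assumptions for illustration only.
import numpy as np
from sklearn.cluster import SpectralClustering  # graph-based clustering stand-in

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))                  # unlabeled feature vectors

# 1. Pseudo-labels from graph-based clustering (no ground-truth labels used).
pseudo = SpectralClustering(n_clusters=10, affinity="nearest_neighbors",
                            random_state=0).fit_predict(X)

# 2. Form (anchor, positive, negative) triplets from the pseudo-labels.
def sample_triplets(X, labels, n=500):
    triplets = []
    for _ in range(n):
        a = rng.integers(len(X))
        pos_pool = np.flatnonzero(labels == labels[a])
        neg_pool = np.flatnonzero(labels != labels[a])
        if len(pos_pool) < 2 or len(neg_pool) == 0:
            continue
        p = rng.choice(pos_pool[pos_pool != a])
        neg = rng.choice(neg_pool)
        triplets.append((a, p, neg))
    return triplets

# 3. Probabilistic objective: treat sigmoid(d_an - d_ap) as the probability that a
#    triplet violates the geometric constraint, and average it over all triplets.
#    This is one plausible reading of "minimizing the chances of each triplet
#    violating a geometric constraint", not the paper's exact loss.
def violation_prob(W, X, triplets):
    Z = X @ W                                   # linear embedding, for illustration
    loss = 0.0
    for a, p, neg in triplets:
        d_ap = np.sum((Z[a] - Z[p]) ** 2)
        d_an = np.sum((Z[a] - Z[neg]) ** 2)
        loss += 1.0 / (1.0 + np.exp(d_an - d_ap))   # P(triplet is violated)
    return loss / len(triplets)

# Orthonormal W stands in for a point on a Riemannian manifold; a full treatment
# would optimize it with a manifold-aware optimizer (e.g. the pymanopt library).
W = np.linalg.qr(rng.normal(size=(32, 16)))[0]
triplets = sample_triplets(X, pseudo)
print(violation_prob(W, X, triplets))
```

In this toy version the objective is only evaluated; reproducing the paper's method would require minimizing it with Riemannian gradient steps over the constrained parameter set, as the abstract indicates.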
