
Semantic Borrowing for Generalized Zero-Shot Learning (2102.04969v2)

Published 30 Jan 2021 in cs.LG, cs.AI, and cs.CV

Abstract: Generalized zero-shot learning (GZSL) is one of the most realistic but challenging problems because the classifier is biased toward supervised (seen) classes, especially under the class-inductive instance-inductive (CIII) training setting, where no testing data are available. Instance-borrowing methods and synthesizing methods mitigate this bias with the help of testing semantics, and therefore neither can be used under CIII; moreover, the latter require training a classifier after generating examples. In contrast, this paper proposes Semantic Borrowing (SB), a novel non-transductive regularization under CIII for improving GZSL methods based on compatibility metric learning, which can be used to train not only linear models but also nonlinear ones such as artificial neural networks. The regularization term in the loss function borrows similar semantics from the training set, so that during training the classifier can more accurately model the relationship between the semantics of zero-shot and supervised classes. Crucially, SB does not need the semantics of unknown classes, which in practice would not be available at training time. Extensive experiments on GZSL benchmark datasets show that SB reduces the classifier's bias toward supervised classes and improves generalized zero-shot classification performance, surpassing inductive GZSL state-of-the-art methods.
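The abstract describes SB only at a high level: a compatibility-metric-learning loss augmented with a regularizer that borrows similar semantics from the training set. The sketch below is a hypothetical illustration of that idea, not the paper's actual formulation: it assumes a bilinear compatibility score `x^T W s`, a standard ranking loss, and an SB-style term (names `sb_regularizer`, `sb_loss`, and the weight `lam` are invented here) that, for each training class, finds the most cosine-similar *training* semantic and encourages consistent compatibility scores for the two, so no unseen-class semantics are needed.

```python
import numpy as np

def compatibility(X, W, S):
    # Bilinear compatibility scores: one row per image feature, one column
    # per class semantic (an assumed form of compatibility metric learning).
    return X @ W @ S.T

def sb_regularizer(W, X, S, labels):
    # Hypothetical SB term: for each sample's true class, "borrow" the most
    # cosine-similar *other* training-class semantic and penalize the gap
    # between the scores assigned to the true and borrowed semantics.
    Sn = S / np.linalg.norm(S, axis=1, keepdims=True)
    sim = Sn @ Sn.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-similarity
    nearest = sim.argmax(axis=1)            # borrowed semantic per class
    scores = compatibility(X, W, S)         # (n_samples, n_classes)
    rows = np.arange(len(labels))
    true_sc = scores[rows, labels]
    borrow_sc = scores[rows, nearest[labels]]
    return np.mean((true_sc - borrow_sc) ** 2)

def sb_loss(W, X, S, labels, lam=0.1, margin=1.0):
    # Standard hinge-based ranking loss over seen classes, plus the
    # SB regularization term weighted by an assumed coefficient lam.
    scores = compatibility(X, W, S)
    rows = np.arange(len(labels))
    true_sc = scores[rows, labels][:, None]
    hinge = np.maximum(0.0, margin + scores - true_sc)
    hinge[rows, labels] = 0.0               # no penalty for the true class
    rank_loss = hinge.sum(axis=1).mean()
    return rank_loss + lam * sb_regularizer(W, X, S, labels)
```

Because the regularizer only compares semantics already in the training set, this sketch stays within the CIII setting the abstract emphasizes: nothing about unseen classes is consulted during training.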

Citations (1)
