
RelationMatch: Matching In-batch Relationships for Semi-supervised Learning (2305.10397v3)

Published 17 May 2023 in cs.LG and cs.CV

Abstract: Semi-supervised learning (SSL) has emerged as a pivotal approach for leveraging scarce labeled data alongside abundant unlabeled data. Despite significant progress, prevailing SSL methods predominantly enforce consistency between different augmented views of individual samples, thereby overlooking the rich relational structure inherent within a mini-batch. In this paper, we present RelationMatch, a novel SSL framework that explicitly enforces in-batch relational consistency through a Matrix Cross-Entropy (MCE) loss function. The proposed MCE loss is rigorously derived from both matrix analysis and information geometry perspectives, ensuring theoretical soundness and practical efficacy. Extensive empirical evaluations on standard benchmarks, including a notable 15.21% accuracy improvement over FlexMatch on STL-10, demonstrate that RelationMatch not only advances state-of-the-art performance but also provides a principled foundation for incorporating relational cues in SSL.
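The abstract names the Matrix Cross-Entropy (MCE) loss without spelling out its form. The PyTorch sketch below is one plausible, minimal instantiation, assuming the loss compares batch-level relation (Gram) matrices built from class probabilities of a weakly and a strongly augmented view, and uses a von Neumann-style cross-entropy -tr(P log Q). The function and variable names are hypothetical; the paper's exact normalization and additional terms may differ.

```python
import torch
import torch.nn.functional as F

def relation_matrix(probs: torch.Tensor) -> torch.Tensor:
    """In-batch relation (Gram) matrix of class-probability vectors.

    probs: (B, C) softmax outputs for one augmented view of a mini-batch.
    Returns a (B, B) symmetric PSD matrix, trace-normalized so it behaves
    like a density matrix. (Normalization choice is an assumption.)
    """
    G = probs @ probs.t()                      # pairwise sample similarities
    return G / G.trace().clamp_min(1e-8)

def matrix_cross_entropy(P: torch.Tensor, Q: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Hypothetical MCE between two relation matrices: -tr(P log Q).

    The matrix logarithm is taken via eigendecomposition, which is valid
    because Q is symmetric PSD; eigenvalues are clamped for stability.
    """
    evals, evecs = torch.linalg.eigh(Q)
    log_Q = evecs @ torch.diag(torch.log(evals.clamp_min(eps))) @ evecs.t()
    return -(P @ log_Q).trace()

# Toy usage: two augmented views of the same unlabeled mini-batch.
B, C = 8, 10
logits_weak = torch.randn(B, C)
logits_strong = torch.randn(B, C, requires_grad=True)

P = relation_matrix(F.softmax(logits_weak, dim=-1)).detach()   # teacher relations
Q = relation_matrix(F.softmax(logits_strong, dim=-1))          # student relations
loss = matrix_cross_entropy(P, Q)
loss.backward()
print(float(loss))
```

In practice this relational term would be added to a standard pseudo-labeling consistency loss (e.g. FixMatch/FlexMatch style), so the model is encouraged to keep pairwise sample relationships consistent across augmentations, not just per-sample predictions.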

Citations (7)
