
Supervised Contrastive Learning with Tree-Structured Parzen Estimator Bayesian Optimization for Imbalanced Tabular Data (2210.10824v2)

Published 19 Oct 2022 in cs.LG

Abstract: Class imbalance has a detrimental effect on the predictive performance of most supervised learning algorithms, as the imbalanced distribution can bias the model toward the majority class. To address this problem, we propose a Supervised Contrastive Learning (SCL) method combined with the Tree-structured Parzen Estimator (TPE) technique for imbalanced tabular datasets. Contrastive learning (CL) can extract information hidden in data even without labels and has shown potential for imbalanced learning tasks. SCL builds on CL by additionally exploiting label information, which also compensates for the scarcity of data augmentation techniques for tabular data. We therefore propose to use SCL to learn a discriminative representation of imbalanced tabular data. Moreover, the temperature hyper-parameter of SCL has a decisive influence on performance and is difficult to tune, so we introduce TPE, a well-known Bayesian optimization technique, to select the best temperature automatically. Experiments are conducted on both binary and multi-class imbalanced tabular datasets. The results show that TPE outperforms three other hyper-parameter optimization (HPO) methods: grid search, random search, and a genetic algorithm. More importantly, the proposed SCL-TPE method achieves much-improved performance compared with state-of-the-art methods.
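To make the temperature-sensitive loss concrete, here is a minimal NumPy sketch of the supervised contrastive loss in the standard batch formulation (Khosla et al., 2020), which is the quantity whose `temperature` hyper-parameter the paper tunes with TPE. This is an illustrative reimplementation, not the authors' code; the function and variable names are our own.

```python
import numpy as np

def _logsumexp(x, axis):
    # Numerically stable log-sum-exp; -inf entries contribute zero weight.
    m = np.max(x, axis=axis, keepdims=True)
    return m + np.log(np.sum(np.exp(x - m), axis=axis, keepdims=True))

def supervised_contrastive_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    z           : (n, d) array of embeddings (L2-normalized internally)
    labels      : (n,) integer class labels
    temperature : the hyper-parameter the paper selects via TPE
    """
    z = np.asarray(z, dtype=float)
    labels = np.asarray(labels)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-length embeddings
    sim = z @ z.T / temperature                       # scaled cosine similarities
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    sim = np.where(self_mask, -np.inf, sim)           # exclude self-pairs
    log_prob = sim - _logsumexp(sim, axis=1)          # row-wise log-softmax
    # Positives: other samples in the batch sharing the anchor's label.
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    pos_counts = pos_mask.sum(axis=1)
    summed = np.where(pos_mask, log_prob, 0.0).sum(axis=1)
    valid = pos_counts > 0                            # anchors with >= 1 positive
    return float(-(summed[valid] / pos_counts[valid]).mean())
```

Because the temperature rescales all similarities before the softmax, small changes in it sharpen or flatten the contrastive distribution, which is why the paper treats it as the key hyper-parameter to optimize.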
