
Robust-GBDT: GBDT with Nonconvex Loss for Tabular Classification in the Presence of Label Noise and Class Imbalance (2310.05067v2)

Published 8 Oct 2023 in cs.LG

Abstract: Dealing with label noise in tabular classification tasks poses a persistent challenge in machine learning. While robust boosting methods have shown promise in binary classification, their effectiveness in complex, multi-class scenarios is often limited. Additionally, issues like imbalanced datasets, missing values, and computational inefficiencies further complicate their practical utility. This study introduces Robust-GBDT, a groundbreaking approach that combines the power of Gradient Boosted Decision Trees (GBDT) with the resilience of nonconvex loss functions against label noise. By leveraging local convexity within specific regions, Robust-GBDT demonstrates unprecedented robustness, challenging conventional wisdom. Through seamless integration of advanced GBDT with a novel Robust Focal Loss tailored for class imbalance, Robust-GBDT significantly enhances generalization capabilities, particularly in noisy and imbalanced datasets. Notably, its user-friendly design facilitates integration with existing open-source code, enhancing computational efficiency and scalability. Extensive experiments validate Robust-GBDT's superiority over other noise-robust methods, establishing a new standard for accurate classification amidst label noise. This research heralds a paradigm shift in machine learning, paving the way for a new era of robust and precise classification across diverse real-world applications.
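The paper's exact Robust Focal Loss is not reproduced here, but the general mechanism it describes — plugging a nonconvex, focal-style loss into a GBDT as a custom objective that supplies per-sample gradients and Hessians — can be sketched as follows. This is a minimal illustration using the standard binary focal loss (Lin et al.), not the paper's modified loss; the function names and the finite-difference derivatives are this sketch's own choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def focal_loss(z, y, gamma=2.0):
    """Binary focal loss on raw scores z, labels y in {0, 1}.

    The (1 - pt)^gamma factor down-weights well-classified samples,
    which is what makes the loss nonconvex in z for gamma > 0 and
    less sensitive to confidently mislabeled (noisy) points.
    """
    p = sigmoid(z)
    pt = np.where(y == 1, p, 1.0 - p)
    return -((1.0 - pt) ** gamma) * np.log(np.clip(pt, 1e-12, 1.0))

def focal_grad_hess(z, y, gamma=2.0, eps=1e-5):
    """Gradient and Hessian of the loss w.r.t. the raw score z.

    Computed by central finite differences for brevity; this is the
    (grad, hess) pair a GBDT custom objective (e.g. LightGBM's or
    XGBoost's objective hook) expects to receive per boosting round.
    """
    g = (focal_loss(z + eps, y, gamma) - focal_loss(z - eps, y, gamma)) / (2 * eps)
    h = (focal_loss(z + eps, y, gamma) - 2 * focal_loss(z, y, gamma)
         + focal_loss(z - eps, y, gamma)) / eps ** 2
    return g, h
```

With `gamma=0` the loss reduces to the ordinary (convex) logistic loss; increasing `gamma` flattens the loss on well-classified samples, which is the "local convexity" regime the paper exploits for robustness to label noise. The paper's Robust-GBDT additionally tailors the loss for class imbalance and multi-class settings.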

Citations (1)
