MixBoost: Synthetic Oversampling with Boosted Mixup for Handling Extreme Imbalance (2009.01571v1)

Published 3 Sep 2020 in cs.LG and stat.ML

Abstract: Training a classification model on a dataset where the instances of one class outnumber those of the other class is a challenging problem. Such imbalanced datasets are standard in real-world situations such as fraud detection, medical diagnosis, and computational advertising. We propose an iterative data augmentation method, MixBoost, which intelligently selects (Boost) and then combines (Mix) instances from the majority and minority classes to generate synthetic hybrid instances that have characteristics of both classes. We evaluate MixBoost on 20 benchmark datasets, show that it outperforms existing approaches, and test its efficacy through significance testing. We also present ablation studies to analyze the impact of the different components of MixBoost.
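The Mix step described in the abstract combines a majority and a minority instance into a convex-combination hybrid, in the style of mixup. A minimal sketch of this idea is below; note that the pairing here is random, standing in for the paper's learned Boost selection step, and the function names and the Beta-distributed mixing coefficient are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def mix_instances(x_maj, x_min, alpha=0.5, rng=None):
    """Combine one majority and one minority instance into a hybrid,
    mixup-style: x = lam * x_min + (1 - lam) * x_maj.
    The Beta(alpha, alpha) mixing coefficient is an assumption."""
    rng = rng if rng is not None else np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    return lam * x_min + (1.0 - lam) * x_maj

def oversample(X_maj, X_min, n_synthetic, rng=None):
    """Generate n_synthetic hybrid instances by randomly pairing
    majority and minority instances (random pairing stands in for
    the paper's intelligent Boost selection)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    i = rng.integers(0, len(X_maj), n_synthetic)
    j = rng.integers(0, len(X_min), n_synthetic)
    return np.stack([mix_instances(X_maj[a], X_min[b], rng=rng)
                     for a, b in zip(i, j)])
```

Each synthetic instance lies on the line segment between its two parents, so it carries characteristics of both classes, which is the property the paper exploits for extreme imbalance.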

Authors (7)
  1. Anubha Kabra (10 papers)
  2. Ayush Chopra (24 papers)
  3. Nikaash Puri (12 papers)
  4. Pinkesh Badjatiya (9 papers)
  5. Sukriti Verma (7 papers)
  6. Piyush Gupta (35 papers)
  7. Balaji K (1 paper)
Citations (6)
