
SOS: Score-based Oversampling for Tabular Data (2206.08555v1)

Published 17 Jun 2022 in cs.LG and cs.AI

Abstract: Score-based generative models (SGMs) are a recent breakthrough in generating fake images and are known to surpass other generative models, e.g., generative adversarial networks (GANs) and variational autoencoders (VAEs). Inspired by their success, in this work we fully customize them for generating fake tabular data. In particular, we are interested in oversampling minority classes, since imbalanced classes frequently lead to sub-optimal training outcomes. To our knowledge, we are the first to present a score-based tabular data oversampling method. First, we redesign the score network to process tabular data. Second, we propose two options for our generation method: the former is equivalent to a style transfer for tabular data, and the latter uses the standard generative policy of SGMs. Lastly, we define a fine-tuning method that further enhances the oversampling quality. In our experiments with 6 datasets and 10 baselines, our method outperforms the other oversampling methods in all cases.
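The abstract only names the "standard generative policy of SGMs" at a high level. The sketch below is not the paper's SOS architecture or its style-transfer option; it is a minimal illustration, assuming an MLP score network trained with denoising score matching (NCSN-style) and annealed Langevin dynamics to draw extra rows for a toy minority class. The `ScoreNet` name, the two-feature synthetic data, and all hyperparameters are assumptions for illustration only.

```python
# Minimal sketch (not the authors' SOS implementation): train a noise-conditional
# MLP score network on a toy minority class, then oversample it with annealed
# Langevin dynamics, i.e. the generic SGM generation policy.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy "minority class": 200 rows, 2 numeric features (stand-in for real tabular data).
minority = torch.randn(200, 2) * 0.3 + torch.tensor([2.0, -1.0])


class ScoreNet(nn.Module):
    """MLP that predicts the noise-conditional score, i.e. the gradient of log p_sigma(x)."""

    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
        # Condition on the noise level and scale the output by 1/sigma.
        return self.net(torch.cat([x, sigma], dim=1)) / sigma


# Geometric ladder of noise levels, from coarse to fine.
sigmas = torch.exp(torch.linspace(math.log(1.0), math.log(0.01), 10))
model = ScoreNet(dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Denoising score matching: perturb rows with a random noise level and regress the
# score of the perturbation kernel, -(x_noisy - x) / sigma^2, with sigma^2 weighting.
for step in range(2000):
    idx = torch.randint(0, minority.shape[0], (128,))
    x = minority[idx]
    sigma = sigmas[torch.randint(0, len(sigmas), (128, 1))]
    noise = torch.randn_like(x)
    x_noisy = x + sigma * noise
    target = -noise / sigma
    loss = ((model(x_noisy, sigma) - target) ** 2 * sigma ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()


# Annealed Langevin dynamics: start from noise and refine through decreasing
# noise levels to obtain new minority-class rows.
@torch.no_grad()
def sample(n: int, steps_per_sigma: int = 50, eps: float = 2e-4) -> torch.Tensor:
    x = torch.randn(n, 2)
    for sigma in sigmas:
        alpha = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(steps_per_sigma):
            z = torch.randn_like(x)
            sigma_col = torch.full((n, 1), sigma.item())
            x = x + 0.5 * alpha * model(x, sigma_col) + alpha.sqrt() * z
    return x


oversampled_rows = sample(100)        # synthetic minority-class rows
print(oversampled_rows.mean(dim=0))   # should land near the minority-class mean
```

In practice, the generated rows would be appended to the training set to rebalance the classes before fitting a downstream classifier; the paper's method additionally redesigns the score network for tabular inputs and adds a fine-tuning stage, neither of which is reflected in this sketch.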

Authors (7)
  1. Jayoung Kim (9 papers)
  2. Chaejeong Lee (5 papers)
  3. Yehjin Shin (5 papers)
  4. Sewon Park (16 papers)
  5. Minjung Kim (30 papers)
  6. Noseong Park (78 papers)
  7. Jihoon Cho (11 papers)
Citations (23)
