
Stochastic Generalized Adversarial Label Learning (1906.00512v2)

Published 3 Jun 2019 in cs.LG and stat.ML

Abstract: The usage of machine learning models has grown substantially and is spreading into several application domains. A common need in using machine learning models is collecting the data required to train these models. In some cases, labeling a massive dataset can be a crippling bottleneck, so there is a need to develop models that work when training labels for large amounts of data are not easily obtained. A possible solution is weak supervision, which uses noisy labels that are easily obtained from multiple sources. The challenge is how best to combine these noisy labels and train a model to perform well given a task. In this paper, we propose stochastic generalized adversarial label learning (Stoch-GALL), a framework for training machine learning models that perform well when noisy and possibly correlated labels are provided. Our framework allows users to provide different weak labels and multiple constraints on these labels. Our model then attempts to learn parameters for the data by solving a non-zero-sum game optimization. The game is between an adversary that chooses labels for the data and a model that minimizes the error made by the adversarial labels. We test our method on three datasets by training convolutional neural network models that learn to classify image objects with limited access to training labels. Our approach is able to learn even in settings where the weak supervision confounds state-of-the-art weakly supervised learning methods. The results of our experiments demonstrate the applicability of this approach to general classification tasks.
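The abstract describes a non-zero-sum game between an adversary that picks soft labels, subject to error-bound constraints on each weak labeler, and a model that minimizes its expected error under those adversarial labels. The sketch below illustrates that structure with alternating updates on toy data; it is not the paper's implementation, and the data, weak labelers, error bounds, and step sizes are all invented for illustration. It uses a logistic-regression model instead of the paper's convolutional networks, and handles the constraints with simple Lagrange multipliers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs in 2-D (hypothetical stand-in for the paper's image datasets)
n, d = 200, 2
X = np.vstack([rng.normal(-1, 1, (n // 2, d)), rng.normal(1, 1, (n // 2, d))])
true_y = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])

# Two hypothetical weak labelers: noisy copies of the true labels (80% / 70% accurate)
weak = np.stack([np.where(rng.random(n) < p, true_y, 1 - true_y) for p in (0.8, 0.7)])
bounds = np.array([0.3, 0.4])  # assumed user-provided error bounds on each weak labeler

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(d)                # model parameters (logistic regression for simplicity)
y = weak.mean(axis=0)          # adversary's soft labels, initialized from the weak votes
lam = np.zeros(len(weak))      # Lagrange multipliers enforcing the error-bound constraints

for step in range(500):
    p = sigmoid(X @ w)
    # Model's expected 0-1 error under soft labels y: mean(y*(1-p) + (1-y)*p).
    # Model player: gradient descent on w to minimize that error.
    grad_w = X.T @ ((1 - 2 * y) * p * (1 - p)) / n
    w -= 0.5 * grad_w
    # Adversary player: projected gradient ascent on y, maximizing model error
    # minus the Lagrangian penalty on each weak labeler's error under y.
    grad_y = (1 - 2 * p) / n - lam @ (1 - 2 * weak) / n
    y = np.clip(y + 1.0 * grad_y, 0.0, 1.0)
    # Multiplier ascent on the constraint violations:
    # weak labeler j's expected error under y is mean(y*(1-q_j) + (1-y)*q_j).
    viol = (y * (1 - weak) + (1 - y) * weak).mean(axis=1) - bounds
    lam = np.maximum(lam + 1.0 * viol, 0.0)

acc = ((sigmoid(X @ w) > 0.5) == true_y).mean()
```

Because the adversary can only degrade labels as far as the error bounds allow, the model still recovers a classifier well above chance even though it never sees the true labels; the stochastic variant in the paper replaces these full-batch gradients with minibatch estimates.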
