
Provably Consistent Partial-Label Learning (2007.08929v2)

Published 17 Jul 2020 in cs.LG and stat.ML

Abstract: Partial-label learning (PLL) is a multi-class classification problem in which each training example is associated with a set of candidate labels. Although many practical PLL methods have been proposed over the last two decades, a theoretical understanding of their consistency is lacking: none of the existing PLL methods comes with a generation process of candidate label sets, so it remains unclear why a method works on a specific dataset and when it may fail on a different dataset. In this paper, we propose the first generation model of candidate label sets and develop two novel PLL methods that are guaranteed to be provably consistent: one is risk-consistent and the other is classifier-consistent. Our methods are advantageous in that they are compatible with any deep network and any stochastic optimizer. Furthermore, thanks to the generation model, we can answer the two questions above by testing whether the generation model matches the given candidate label sets. Experiments on benchmark and real-world datasets validate the effectiveness of the proposed generation model and the two PLL methods.
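To make the setting concrete, the classifier-consistent idea can be illustrated as minimizing the negative log of the total probability mass a softmax model assigns to the candidate set. The sketch below (in NumPy; all names are illustrative, and the toy candidate-set generator is a simple assumption rather than the paper's exact generation model) shows the data format and such a loss:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_candidate_set(true_label, num_classes, flip_prob=0.5):
    """Toy candidate-set generator: the true label is always included,
    and every other label enters independently with probability flip_prob.
    (An illustrative choice, not necessarily the paper's generation model.)"""
    mask = rng.random(num_classes) < flip_prob
    mask[true_label] = True
    return mask.astype(float)

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def candidate_set_loss(logits, candidate_mask):
    """Classifier-consistent-style loss sketch: negative log of the
    probability mass the model places on the candidate labels.
    With a singleton candidate set this reduces to cross-entropy."""
    p = softmax(logits)
    mass_on_candidates = (p * candidate_mask).sum(axis=-1)
    return -np.log(mass_on_candidates + 1e-12)
```

When the candidate set covers all classes the loss is (numerically) zero, since the model's mass on the set is 1; the loss grows as probability leaks outside the candidate set, which is what drives the classifier toward the true label.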

Authors (8)
  1. Lei Feng (190 papers)
  2. Jiaqi Lv (14 papers)
  3. Bo Han (282 papers)
  4. Miao Xu (43 papers)
  5. Gang Niu (125 papers)
  6. Xin Geng (90 papers)
  7. Bo An (128 papers)
  8. Masashi Sugiyama (286 papers)
Citations (130)
