Classify and Generate Reciprocally: Simultaneous Positive-Unlabelled Learning and Conditional Generation with Extra Data
Abstract: The scarcity of class-labeled data is a ubiquitous bottleneck in many machine learning problems. While abundant unlabeled data typically exist and offer a potential solution, exploiting them is highly challenging. In this paper, we address this problem by leveraging Positive-Unlabeled (PU) classification and conditional generation with extra unlabeled data simultaneously. In particular, we present a novel training framework that jointly targets PU classification and conditional generation when exposed to extra data, especially out-of-distribution unlabeled data, by exploiting the interplay between the two tasks: 1) enhancing the performance of PU classifiers with the assistance of a novel Classifier-Noise-Invariant Conditional GAN (CNI-CGAN) that is robust to noisy labels, and 2) leveraging extra data, with labels predicted by the PU classifier, to improve generation. Theoretically, we prove the optimal condition of CNI-CGAN; experimentally, extensive evaluations on diverse datasets verify simultaneous improvements in both classification and generation.
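For context on the PU classification component: PU methods train a binary classifier from labeled positives and unlabeled data alone by rewriting the negative-class risk in terms of the unlabeled distribution. Below is a minimal, illustrative sketch of the widely used non-negative PU risk estimator with a sigmoid surrogate loss; the function names, the loss choice, and the use of NumPy are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def nn_pu_risk(scores_p, scores_u, pi_p):
    """Non-negative PU risk estimate.

    scores_p : real-valued classifier scores on labeled positive samples
    scores_u : real-valued classifier scores on unlabeled samples
    pi_p     : class prior P(y = +1), assumed known or estimated separately
    """
    loss = lambda z: 1.0 / (1.0 + np.exp(z))  # sigmoid surrogate loss of score z
    r_p_pos = loss(scores_p).mean()           # risk of positives treated as +1
    r_p_neg = loss(-scores_p).mean()          # risk of positives treated as -1
    r_u_neg = loss(-scores_u).mean()          # risk of unlabeled treated as -1
    # The negative-class risk pi_n * R_n is estimated as R_u - pi_p * R_p;
    # clipping it at zero keeps the empirical risk non-negative and curbs
    # overfitting when the model drives the estimate below zero.
    return pi_p * r_p_pos + max(0.0, r_u_neg - pi_p * r_p_neg)
```

A trainable PU classifier would minimize this quantity over minibatches; the clipped term is exactly what distinguishes the non-negative estimator from the unbiased one, which can diverge to negative risk with flexible models.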