Pruning-Based Extraction of Descriptions from Probabilistic Circuits (2311.13379v2)

Published 22 Nov 2023 in cs.AI

Abstract: Concept learning is a general task with applications in various domains. As a motivating example we consider music playlist generation, where a playlist is represented as a concept (e.g., 'relaxing music') rather than as a fixed collection of songs. In this work we use a probabilistic circuit to learn a concept from positively labelled and unlabelled examples. While these circuits form an attractive tractable model for this task, it is challenging for a domain expert to inspect and analyse them, which impedes their use in certain applications. We propose to resolve this by converting a learned probabilistic circuit into a logic-based discriminative model that covers the high density regions of the circuit, i.e., those regions the circuit classifies as certainly being part of the learned concept. As part of this approach we present two contributions: PUTPUT, an algorithm that prunes low density regions from a probabilistic circuit, and aggregated entropy, a newly proposed description length; PUTPUT considers both the F1-score and this description length during pruning. Our experiments demonstrate that our approach yields effective discriminative models, outperforming competitors on the music playlist generation task and similar datasets.
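The abstract only sketches the approach at a high level. As a rough illustration of the general idea (prune away low-density parts of a circuit, then read the surviving support off as a logical description), the toy Python sketch below removes low-weight branches from a small hand-built sum-product circuit and enumerates the remaining high-density assignments. This is not the paper's PUTPUT algorithm: the circuit, the weight-based pruning criterion, the thresholds, and all names are assumptions made purely for illustration.

```python
# Illustrative sketch only: a toy probabilistic circuit over binary variables,
# pruned by dropping low-weight mixture branches, with the remaining
# high-density assignments read off as a DNF-style description.
# This is NOT PUTPUT; all names, thresholds, and the circuit are assumptions.

from dataclasses import dataclass
from itertools import product as cartesian


@dataclass
class Leaf:
    """Bernoulli leaf: P(var = 1) = p."""
    var: str
    p: float

    def density(self, x):
        return self.p if x[self.var] == 1 else 1.0 - self.p


@dataclass
class Product:
    children: list

    def density(self, x):
        d = 1.0
        for c in self.children:
            d *= c.density(x)
        return d


@dataclass
class Sum:
    weights: list
    children: list

    def density(self, x):
        return sum(w * c.density(x) for w, c in zip(self.weights, self.children))


def prune_sum_nodes(node, threshold=0.2):
    """Drop sum-node branches whose mixture weight is below `threshold` and
    renormalise the survivors (assumes at least one branch survives).
    A crude stand-in for density-based pruning."""
    if isinstance(node, Leaf):
        return node
    if isinstance(node, Product):
        return Product([prune_sum_nodes(c, threshold) for c in node.children])
    kept = [(w, prune_sum_nodes(c, threshold))
            for w, c in zip(node.weights, node.children) if w >= threshold]
    total = sum(w for w, _ in kept)
    return Sum([w / total for w, _ in kept], [c for _, c in kept])


def high_density_region(node, variables, eps=0.3):
    """Enumerate assignments whose density under the (pruned) circuit exceeds
    eps; together they describe the learned concept."""
    region = []
    for values in cartesian([0, 1], repeat=len(variables)):
        x = dict(zip(variables, values))
        if node.density(x) > eps:
            region.append(x)
    return region


if __name__ == "__main__":
    variables = ["acoustic", "slow_tempo", "instrumental"]
    # Toy circuit: a mixture of two "styles" of relaxing music plus a
    # low-weight noise branch that pruning removes.
    circuit = Sum(
        weights=[0.7, 0.2, 0.1],
        children=[
            Product([Leaf("acoustic", 0.9), Leaf("slow_tempo", 0.9), Leaf("instrumental", 0.8)]),
            Product([Leaf("acoustic", 0.5), Leaf("slow_tempo", 0.9), Leaf("instrumental", 0.2)]),
            Product([Leaf("acoustic", 0.1), Leaf("slow_tempo", 0.1), Leaf("instrumental", 0.1)]),
        ],
    )
    pruned = prune_sum_nodes(circuit, threshold=0.2)
    for assignment in high_density_region(pruned, variables, eps=0.3):
        print(assignment)
```

Note that the actual method described in the abstract additionally trades off the F1-score against the aggregated-entropy description length when deciding what to prune; the fixed weight threshold above does not capture that trade-off.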
