Inductive Logic Boosting (1402.6077v1)

Published 25 Feb 2014 in cs.LG and cs.AI

Abstract: Recent years have seen a surge of interest in Probabilistic Logic Programming (PLP) and Statistical Relational Learning (SRL) models that combine logic with probabilities. Structure learning for these systems lies at the intersection of Inductive Logic Programming (ILP) and statistical learning (SL). However, ILP cannot deal with probabilities, and SL cannot model relational hypotheses. The central challenge in integrating the two frameworks is estimating the probability of a logic clause solely from observations of ground logic atoms. Many current methods model a joint probability by representing each clause as a graphical model with literals as its vertices. Such models remain too complex and can only be approximated by pseudo-likelihood. We propose the Inductive Logic Boosting framework, which transforms the relational dataset into a feature-based dataset, induces logic rules by boosting ProbLog Rule Trees, and relaxes the independence constraint of pseudo-likelihood. Experimental evaluation on benchmark datasets demonstrates that the AUC-PR and AUC-ROC values of the learned rules are higher than those of current state-of-the-art SRL methods.
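To make the "transform the relational dataset into a feature-based dataset, then boost" idea concrete, here is a minimal sketch. The relations (smokes, friends), the target predicate (cancer), and the feature choices are illustrative assumptions, not the paper's benchmarks, and shallow scikit-learn trees stand in for the paper's ProbLog Rule Trees; the sketch only shows the propositionalize-then-boost pipeline, not the paper's actual method.

```python
# Sketch: propositionalize a toy relational dataset, then boost shallow trees.
from sklearn.ensemble import GradientBoostingClassifier

# Ground atoms of an assumed toy relational dataset.
people = ["ann", "bob", "cal", "dee"]
smokes = {"ann", "cal"}
friends = {("ann", "bob"), ("bob", "cal"), ("cal", "dee")}
cancer = {"ann", "cal"}  # target predicate to learn

def propositionalize(person):
    """Flatten the relational neighbourhood of `person` into a
    fixed-length feature vector: own attributes plus aggregates
    over the friends relation."""
    fs = [q for p, q in friends if p == person] + \
         [p for p, q in friends if q == person]
    return [
        int(person in smokes),         # smokes(X)
        len(fs),                       # number of friends of X
        sum(f in smokes for f in fs),  # number of smoking friends
    ]

X = [propositionalize(p) for p in people]
y = [int(p in cancer) for p in people]

# Boost shallow trees; each depth-2 tree is readable as a small rule,
# e.g. "smokes(X) and at least one smoking friend -> cancer(X)".
model = GradientBoostingClassifier(n_estimators=10, max_depth=2)
model.fit(X, y)
print(model.predict_proba(X)[:, 1])  # per-person probability estimates
```

Because each boosted tree stays shallow, its branches can be read back as clause-like rules over the propositionalized features, which is the intuition behind inducing logic rules via boosting.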
