
Neural Probabilistic Logic Programming in Discrete-Continuous Domains (2303.04660v2)

Published 8 Mar 2023 in cs.AI, cs.LG, cs.LO, cs.PL, and cs.SC

Abstract: Neural-symbolic AI (NeSy) allows neural networks to exploit symbolic background knowledge in the form of logic. It has been shown to aid learning in the limited data regime and to facilitate inference on out-of-distribution data. Probabilistic NeSy focuses on integrating neural networks with both logic and probability theory, which additionally allows learning under uncertainty. A major limitation of current probabilistic NeSy systems, such as DeepProbLog, is their restriction to finite probability distributions, i.e., discrete random variables. In contrast, deep probabilistic programming (DPP) excels in modelling and optimising continuous probability distributions. Hence, we introduce DeepSeaProbLog, a neural probabilistic logic programming language that incorporates DPP techniques into NeSy. Doing so results in the support of inference and learning of both discrete and continuous probability distributions under logical constraints. Our main contributions are 1) the semantics of DeepSeaProbLog and its corresponding inference algorithm, 2) a proven asymptotically unbiased learning algorithm, and 3) a series of experiments that illustrate the versatility of our approach.
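To make the kind of query concrete, here is a minimal, hedged sketch of sampling-based inference over a mixed discrete-continuous model under a logical constraint. This is not DeepSeaProbLog's actual syntax or inference algorithm (the paper uses DPP techniques and a proven asymptotically unbiased learning procedure); the model, names, and parameters below are illustrative assumptions only.

```python
# Illustrative sketch only: estimate the probability of a logical query
# over one discrete and one continuous random variable via Monte Carlo.
# The model (machine_ok ~ Bernoulli(0.9), temperature ~ Normal(80, 15))
# is a made-up example, not from the paper.
import random

random.seed(0)

def estimate_query_probability(n_samples=100_000):
    """Estimate P(q) for the query q :- machine_ok, temperature < 100."""
    hits = 0
    for _ in range(n_samples):
        machine_ok = random.random() < 0.9        # discrete fact
        temperature = random.gauss(80.0, 15.0)    # continuous fact
        if machine_ok and temperature < 100.0:    # logical conjunction
            hits += 1
    return hits / n_samples

p = estimate_query_probability()
# Analytically: 0.9 * Phi((100 - 80) / 15) ≈ 0.818
```

Naive sampling like this breaks down during learning, since the indicator function is not differentiable; addressing that is part of what the paper's learning algorithm contributes.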

Authors (6)
  1. Lennert De Smet (5 papers)
  2. Pedro Zuidberg Dos Martires (22 papers)
  3. Robin Manhaeve (12 papers)
  4. Giuseppe Marra (39 papers)
  5. Angelika Kimmig (25 papers)
  6. Luc De Raedt (55 papers)
Citations (13)
