
Lifted Inference with Linear Order Axiom (2211.01164v1)

Published 2 Nov 2022 in cs.AI and cs.LO

Abstract: We consider the task of weighted first-order model counting (WFOMC), used for probabilistic inference in the area of statistical relational learning. Given a formula $\phi$, a domain size $n$ and a pair of weight functions, what is the weighted sum of all models of $\phi$ over a domain of size $n$? It was shown that computing WFOMC of any logical sentence with at most two logical variables can be done in time polynomial in $n$. However, it was also shown that the task is $\texttt{#}P_1$-complete once a third variable is added, which inspired the search for extensions of the two-variable fragment that still permit a running time polynomial in $n$. One such extension is the two-variable fragment with counting quantifiers. In this paper, we prove that adding a linear order axiom (which forces one of the predicates in $\phi$ to introduce a linear ordering of the domain elements in each model of $\phi$) on top of the counting quantifiers still permits computation in time polynomial in the domain size. We present a new dynamic programming-based algorithm which can compute WFOMC with the linear order axiom in time polynomial in $n$, thus proving our primary claim.
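To make the WFOMC definition in the abstract concrete, the following Python sketch computes it by brute force: it enumerates every interpretation of the ground atoms over a small domain, keeps the ones that satisfy $\phi$, and sums the products of per-atom weights. This is only an illustration of the quantity being computed, not the paper's polynomial-time dynamic programming algorithm; the example formula (symmetry of a binary predicate $R$), the weight values, and all function names are illustrative assumptions.

```python
from itertools import product

def wfomc_bruteforce(n, predicates, weights, neg_weights, satisfies):
    """Brute-force WFOMC: sum, over all models of the formula on domain {0..n-1},
    of the product of weights of true atoms and negative weights of false atoms."""
    domain = range(n)
    # One ground atom per predicate and per tuple of domain elements.
    atoms = [(p, args) for p, arity in predicates
             for args in product(domain, repeat=arity)]
    total = 0
    for bits in product([False, True], repeat=len(atoms)):
        interp = dict(zip(atoms, bits))          # one interpretation (possible world)
        if not satisfies(interp, domain):        # keep only models of the formula
            continue
        w = 1
        for (p, _), val in interp.items():
            w *= weights[p] if val else neg_weights[p]
        total += w
    return total

# Illustrative formula: phi = forall x, y. R(x, y) -> R(y, x)   (R is symmetric)
def sat_symmetric(interp, domain):
    return all(not interp[("R", (x, y))] or interp[("R", (y, x))]
               for x in domain for y in domain)

# Domain size 3, weight 2 for a true R-atom and 1 for a false one.
print(wfomc_bruteforce(3, [("R", 2)], {"R": 2}, {"R": 1}, sat_symmetric))
```

The enumeration is exponential in the number of ground atoms, which is exactly why the polynomial-in-$n$ algorithms studied in the paper matter.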

Citations (10)
