
On the Configuration of More and Less Expressive Logic Programs (2203.01024v1)

Published 2 Mar 2022 in cs.AI

Abstract: The decoupling between the representation of a problem, i.e., its knowledge model, and the reasoning side is one of the main strengths of model-based AI. It allows, e.g., the reasoning side to be improved in isolation, with benefits for the whole solving process. Further, it is well known that many solvers are very sensitive to even syntactic changes in the input. In this paper, we focus on improving the reasoning side by taking advantage of such sensitivity. We consider two well-known model-based AI methodologies, SAT and ASP, define a number of syntactic features that may characterise their inputs, and use automated configuration tools to reformulate the input formula or program. Results of a wide experimental analysis involving SAT and ASP domains, taken from the respective competitions, show the different advantages that can be obtained by using input reformulation and configuration. Under consideration in Theory and Practice of Logic Programming (TPLP).
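
The core idea in the abstract is to extract cheap syntactic features from the input (a CNF formula for SAT, a logic program for ASP) and let an automated configuration tool choose a reformulation. As a rough illustration only, the sketch below computes a handful of such features from a DIMACS CNF file; the feature names and the simple one-clause-per-line parser are assumptions made for this example, not the paper's actual feature set or tooling.

```python
# Hypothetical sketch: simple syntactic features of a SAT instance in DIMACS
# CNF format, of the kind a configurator could use to decide how to
# reformulate the input. Not taken from the paper; for illustration only.
from collections import Counter

def cnf_features(path):
    """Return a dict of basic syntactic features of a DIMACS CNF file."""
    num_vars = 0
    clause_lengths = []
    pos_lits = neg_lits = 0
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("c", "%")):
                continue  # skip comments and blank lines
            if line.startswith("p cnf"):
                num_vars = int(line.split()[2])  # header: "p cnf <vars> <clauses>"
                continue
            # assume one clause per line, terminated by 0
            lits = [int(tok) for tok in line.split() if tok != "0"]
            clause_lengths.append(len(lits))
            pos_lits += sum(1 for l in lits if l > 0)
            neg_lits += sum(1 for l in lits if l < 0)

    num_clauses = len(clause_lengths)
    hist = Counter(clause_lengths)
    total_lits = pos_lits + neg_lits
    return {
        "num_vars": num_vars,
        "num_clauses": num_clauses,
        "clause_var_ratio": num_clauses / num_vars if num_vars else 0.0,
        "frac_binary": hist[2] / num_clauses if num_clauses else 0.0,
        "frac_ternary": hist[3] / num_clauses if num_clauses else 0.0,
        "frac_positive_literals": pos_lits / total_lits if total_lits else 0.0,
    }

if __name__ == "__main__":
    import sys
    print(cnf_features(sys.argv[1]))
```

Such a feature vector could then be fed to a generic algorithm-configuration tool, which searches over reformulation options (e.g., clause reordering or normal-form variants) and solver parameters; the specific features and configurator used in the paper may differ.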

Citations (1)
