Query-Subquery Nets (arXiv:1201.2564v1)

Published 12 Jan 2012 in cs.DB and cs.LO

Abstract: We formulate query-subquery nets and use them to create the first framework for developing algorithms for evaluating queries to Horn knowledge bases with the properties that: the approach is goal-directed; each subquery is processed only once and each supplement tuple, if desired, is transferred only once; operations are done set-at-a-time; and any control strategy can be used. Our intention is to increase efficiency of query processing by eliminating redundant computation, increasing flexibility and reducing the number of accesses to the secondary storage. The framework forms a generic evaluation method called QSQN. To deal with function symbols, we use a term-depth bound for atoms and substitutions occurring in the computation and propose to use iterative deepening search which iteratively increases the term-depth bound. We prove soundness and completeness of our generic evaluation method and show that, when the term-depth bound is fixed, the method has PTIME data complexity. We also present how tail recursion elimination can be incorporated into our framework and propose two exemplary control strategies, one is to reduce the number of accesses to the secondary storage, while the other is depth-first search.
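The abstract's core mechanism for handling function symbols — bounding the term depth of atoms and substitutions, and iteratively increasing that bound — can be illustrated with a small sketch. The Python code below is not the paper's query-subquery net (QSQN) framework: it uses a tuple-at-a-time SLD-style prover rather than QSQN's set-at-a-time, goal-directed relational processing, and every identifier (subst, unify, solve, query, the '?'-prefixed variable convention) is an illustrative assumption. It only shows how a term-depth bound can prune resolution steps over a Horn knowledge base, and how iterative deepening relaxes that bound.

```python
import itertools

# Terms: variables are strings starting with '?'; compound terms (and atoms)
# are tuples (functor, arg1, ..., argN); constants are plain strings.

def subst(t, s):
    """Apply substitution s to term t."""
    while isinstance(t, str) and t.startswith('?') and t in s:
        t = s[t]
    if isinstance(t, tuple):
        return (t[0],) + tuple(subst(a, s) for a in t[1:])
    return t

def term_depth(t):
    """Term depth: constants and variables have depth 0."""
    if isinstance(t, tuple):
        return 1 + max((term_depth(a) for a in t[1:]), default=0)
    return 0

def unify(a, b, s):
    """Return an extended substitution unifying a and b, or None (occurs check omitted)."""
    a, b = subst(a, s), subst(b, s)
    if a == b:
        return s
    if isinstance(a, str) and a.startswith('?'):
        return {**s, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def rename(t, k):
    """Give clause variables a fresh suffix so each clause use is independent."""
    if isinstance(t, str) and t.startswith('?'):
        return t + '_' + str(k)
    if isinstance(t, tuple):
        return (t[0],) + tuple(rename(a, k) for a in t[1:])
    return t

def solve(goals, program, s, bound, fresh):
    """SLD-style resolution, discarding steps whose atoms exceed the term-depth bound."""
    if not goals:
        yield s
        return
    goal, rest = goals[0], goals[1:]
    for head, body in program:
        k = next(fresh)
        head_r, body_r = rename(head, k), [rename(b, k) for b in body]
        s2 = unify(goal, head_r, s)
        if s2 is None:
            continue
        # Term-depth bound: prune any resolution step that produces an atom
        # deeper than the current bound.
        if any(term_depth(subst(g, s2)) > bound for g in body_r + [goal]):
            continue
        yield from solve(tuple(body_r) + tuple(rest), program, s2, bound, fresh)

def query(goal, program, max_bound=6):
    """Iteratively deepen the term-depth bound until some answer is found."""
    for bound in range(1, max_bound + 1):
        answers = {subst(goal, s)
                   for s in solve((goal,), program, {}, bound, itertools.count())}
        if answers:
            return bound, answers
    return max_bound, set()

# Example: Peano naturals, i.e. nat(0) and nat(s(X)) :- nat(X).
program = [
    (('nat', '0'), []),
    (('nat', ('s', '?X')), [('nat', '?X')]),
]
print(query(('nat', ('s', ('s', '0'))), program))
# -> (3, {('nat', ('s', ('s', '0')))}): the goal is proved once the bound reaches 3
```

As in the abstract, pruning by term depth keeps each bounded evaluation terminating even with function symbols, and restarting with a larger bound recovers answers that an earlier bound cut off; the paper's contribution is doing this goal-directed evaluation set-at-a-time with subqueries processed only once, which this sketch does not attempt.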

Citations (11)