
On the Parameterized Complexity of Learning First-Order Logic (2102.12201v2)

Published 24 Feb 2021 in cs.LO

Abstract: We analyse the complexity of learning first-order queries in a model-theoretic framework for supervised learning introduced by Grohe and Turán (TOCS 2004). Previous research on the complexity of learning in this framework focussed on the question of when learning is possible in time sublinear in the background structure. Here we study the parameterized complexity of the learning problem. We have two main results. The first is a hardness result, showing that learning first-order queries is at least as hard as the corresponding model-checking problem, which implies that on general structures it is hard for the parameterized complexity class AW[*]. Our second main contribution is a fixed-parameter tractable agnostic PAC learning algorithm for first-order queries over sparse relational data (more precisely, over nowhere dense background structures).
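
To make the learning problem concrete, here is a minimal, hypothetical Python sketch of the supervised-learning setup the abstract refers to (not the paper's algorithm): a finite background structure, labelled example elements, and a brute-force search for a first-order query consistent with the labels. The formula encoding, the `evaluate` helper, and the small hypothesis pool are illustrative assumptions, standing in for "all FO formulas of bounded size"; the parameterized question is how such a search scales when the parameter is the size of the target formula.

```python
# Background structure: a directed 4-cycle on {0..3} plus an isolated vertex 4.
universe = [0, 1, 2, 3, 4]
E = {(0, 1), (1, 2), (2, 3), (3, 0)}

def evaluate(phi, assignment):
    """Evaluate an FO formula (encoded as nested tuples) under an assignment."""
    op = phi[0]
    if op == "E":                      # atomic: E(x, y)
        return (assignment[phi[1]], assignment[phi[2]]) in E
    if op == "not":
        return not evaluate(phi[1], assignment)
    if op == "exists":                 # ("exists", var, subformula)
        return any(evaluate(phi[2], {**assignment, phi[1]: a})
                   for a in universe)
    raise ValueError(f"unknown operator {op!r}")

# Labelled examples for a unary query: (element, label) pairs.
examples = [(0, True), (2, True), (4, False)]

# A tiny illustrative hypothesis pool; a real learner would enumerate all
# FO formulas up to the parameter (formula size / quantifier rank).
pool = [
    ("E", "x", "x"),                               # x has a self-loop
    ("exists", "y", ("E", "x", "y")),              # x has an out-neighbour
    ("not", ("exists", "y", ("E", "x", "y"))),     # x is a sink
]

def learn(pool, examples):
    """Return some formula consistent with every labelled example, if any."""
    for phi in pool:
        if all(evaluate(phi, {"x": e}) == label for e, label in examples):
            return phi
    return None

print(learn(pool, examples))   # -> ('exists', 'y', ('E', 'x', 'y'))
```

Naive enumeration of this kind blows up rapidly with formula size; the paper's second result shows that, over nowhere dense background structures, agnostic PAC learning of first-order queries is nonetheless fixed-parameter tractable.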

Citations (7)
