Query learning of derived $ω$-tree languages in polynomial time (1802.04739v3)

Published 13 Feb 2018 in cs.LO

Abstract: We present the first polynomial time algorithm to learn nontrivial classes of languages of infinite trees. Specifically, our algorithm uses membership and equivalence queries to learn classes of $\omega$-tree languages derived from weak regular $\omega$-word languages in polynomial time. The method is a general polynomial time reduction of learning a class of derived $\omega$-tree languages to learning the underlying class of $\omega$-word languages, for any class of $\omega$-word languages recognized by a deterministic Büchi acceptor. Our reduction, combined with the polynomial time learning algorithm of Maler and Pnueli [1995] for the class of weak regular $\omega$-word languages, yields the main result. We also show that subset queries that return counterexamples can be implemented in polynomial time using subset queries that return no counterexamples, for deterministic or non-deterministic finite word acceptors and deterministic or non-deterministic Büchi $\omega$-word acceptors. A previous claim of an algorithm to learn regular $\omega$-trees due to Jayasrirani, Begam and Thomas [2008] is unfortunately incorrect, as shown in Angluin [2016].
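
The algorithms described in the abstract work in the model of learning with membership and equivalence queries (a minimally adequate teacher). As a rough illustration of that query interface only, and not of the paper's reduction or of $\omega$-tree languages, the Python sketch below shows such a teacher for a toy target language over finite words; the class, parameter, and sample names are assumptions introduced for this example.

```python
from typing import Callable, Optional


class Teacher:
    """Answers membership and equivalence queries for a fixed target language."""

    def __init__(self, in_language: Callable[[str], bool], universe: list[str]):
        self._in_language = in_language
        # finite sample of words used here to approximate the equivalence check
        self._universe = universe

    def membership(self, word: str) -> bool:
        """Is `word` in the target language?"""
        return self._in_language(word)

    def equivalence(self, hypothesis: Callable[[str], bool]) -> Optional[str]:
        """Return a word on which the hypothesis disagrees with the target,
        or None if no disagreement is found on the sample."""
        for w in self._universe:
            if hypothesis(w) != self._in_language(w):
                return w
        return None


# Toy usage: the target is the set of words over {a, b} with an even number of a's.
teacher = Teacher(
    in_language=lambda w: w.count("a") % 2 == 0,
    universe=["", "a", "b", "ab", "aa", "aab", "aba"],
)
assert teacher.membership("ab") is False
assert teacher.equivalence(lambda w: True) == "a"                    # counterexample returned
assert teacher.equivalence(lambda w: w.count("a") % 2 == 0) is None  # hypothesis accepted
```

In the paper's setting, queries concern $\omega$-trees and $\omega$-words and equivalence is exact rather than checked against a finite sample; this sketch only fixes the shape of the question-and-answer protocol the abstract refers to.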
