The teaching complexity of erasing pattern languages with bounded variable frequency (1905.07737v1)

Published 19 May 2019 in cs.FL

Abstract: Patterns provide a concise, syntactic way of describing a set of strings, but their expressive power comes at a price: a number of fundamental decision problems concerning (erasing) pattern languages, such as the membership problem and inclusion problem, are known to be NP-complete or even undecidable, while the decidability of the equivalence problem is still open; in learning theory, the class of pattern languages is unlearnable in models such as the distribution-free (PAC) framework (if $\mathcal{P}/poly \neq \mathcal{NP}/poly$). Much work on the algorithmic learning of pattern languages has thus focussed on interesting subclasses of patterns for which positive learnability results may be achieved. A natural restriction on a pattern is a bound on its variable frequency -- the maximum number $m$ such that some variable occurs exactly $m$ times in the pattern. This paper examines the effect of limiting the variable frequency of all patterns belonging to a class $\Pi$ on the worst-case minimum number of labelled examples needed to uniquely identify any pattern of $\Pi$ in cooperative teaching-learning models. Two such models, the teaching dimension model as well as the preference-based teaching model, will be considered.
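
To make the objects in the abstract concrete, the following Python sketch (purely illustrative, not taken from the paper) encodes a pattern as a sequence of terminals and variables, computes its variable frequency, and enumerates a finite sample of the corresponding erasing pattern language by allowing each variable to be replaced by a string over the alphabet, including the empty string. The pattern, alphabet, and length bound are arbitrary choices for the example.

```python
from itertools import product
from collections import Counter

# A pattern is a sequence of items: terminals ('t', symbol) and variables
# ('v', name). Example pattern: x1 a x1 b x2 over the alphabet {a, b}.
PATTERN = [('v', 'x1'), ('t', 'a'), ('v', 'x1'), ('t', 'b'), ('v', 'x2')]
ALPHABET = ['a', 'b']


def variable_frequency(pattern):
    """Maximum number of times any single variable occurs in the pattern."""
    counts = Counter(name for kind, name in pattern if kind == 'v')
    return max(counts.values(), default=0)


def substitute(pattern, assignment):
    """Apply a substitution mapping each variable to a (possibly empty)
    terminal string; erasing a variable means mapping it to ''."""
    return ''.join(name if kind == 't' else assignment[name]
                   for kind, name in pattern)


def language_sample(pattern, alphabet, max_sub_len=2):
    """Enumerate the words obtained from all substitutions in which every
    variable is replaced by a string over `alphabet` of length <= max_sub_len
    (the empty string is allowed, hence an *erasing* pattern language)."""
    variables = sorted({name for kind, name in pattern if kind == 'v'})
    strings = [''.join(p)
               for n in range(max_sub_len + 1)
               for p in product(alphabet, repeat=n)]
    return {substitute(pattern, dict(zip(variables, choice)))
            for choice in product(strings, repeat=len(variables))}


if __name__ == '__main__':
    print(variable_frequency(PATTERN))                 # 2: x1 occurs twice
    print(sorted(language_sample(PATTERN, ALPHABET, max_sub_len=1)))
```

Here the example pattern has variable frequency 2, so it would belong to a class with frequency bound m = 2 in the sense used by the paper.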

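The teaching dimension model mentioned in the abstract measures, for each concept in a class, the minimum number of labelled examples that are consistent with that concept and with no other concept in the class, and then takes the worst case over the class. The brute-force sketch below makes that definition concrete on a tiny, hypothetical finite class of languages represented as sets of strings; it does not reflect the paper's constructions for pattern classes and says nothing about the preference-based teaching model.

```python
from itertools import combinations


def teaching_dimension(concept_class, instance_space):
    """Worst-case minimum teaching-set size over a finite concept class.

    A teaching set for a concept C is a set of labelled instances
    {(x, x in C)} with which C is consistent but no other concept in the
    class is; the teaching dimension is the largest such minimum."""
    def min_teaching_set_size(target):
        others = [c for c in concept_class if c != target]
        for size in range(len(instance_space) + 1):
            for sample in combinations(instance_space, size):
                labels = [(x, x in target) for x in sample]
                # `sample` teaches `target` if every other concept disagrees
                # with at least one of the labelled instances.
                if all(any((x in c) != lbl for x, lbl in labels)
                       for c in others):
                    return size
        return len(instance_space)

    return max(min_teaching_set_size(c) for c in concept_class)


# Toy concept class: three finite languages over a small instance space.
instance_space = ['', 'a', 'b', 'ab', 'ba']
concept_class = [
    frozenset({'', 'a', 'ab'}),
    frozenset({'', 'b', 'ba'}),
    frozenset({'a', 'b', 'ab', 'ba'}),
]
print(teaching_dimension(concept_class, instance_space))  # 1 for this toy class
```

For infinite classes such as pattern languages the instance space cannot be enumerated like this, which is why the paper studies how the frequency bound on the patterns controls the worst-case number of labelled examples needed.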