The Complexity of Learning Principles and Parameters Grammars (1207.0052v3)

Published 30 Jun 2012 in cs.FL and cs.CL

Abstract: We investigate models for learning the class of context-free and context-sensitive languages (CFLs and CSLs). We begin with a brief discussion of some early hardness results which show that unrestricted language learning is impossible and that unrestricted CFL learning is computationally infeasible; we then briefly survey the literature on algorithms for learning restricted subclasses of the CFLs. Finally, we introduce a new family of subclasses, the principled parametric context-free grammars (PPCFGs), together with a corresponding family of principled parametric context-sensitive grammars (PPCSGs), which roughly model the "Principles and Parameters" framework in psycholinguistics. We present three hardness results: first, that the PPCFGs are not efficiently learnable given equivalence and membership oracles; second, that the PPCFGs are not efficiently learnable from positive presentations unless P = NP; and third, that the PPCSGs are not efficiently learnable from positive presentations unless integer factorization is in P.

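To make the abstract's central object concrete, the following is a minimal sketch of a parametric grammar and a brute-force learner from a positive presentation. Everything in it (the head_initial and null_subject parameters, the rule inventory, the learner) is a hypothetical illustration of the general idea of boolean parameters gating a fixed rule set, not the paper's actual PPCFG family or proof constructions.

from itertools import product

# Toy illustration in the spirit of "Principles and Parameters":
# a fixed rule inventory gated by a small vector of boolean
# parameters. Hypothetical; not the paper's PPCFG construction.

def productions(head_initial, null_subject):
    """Return the CFG productions selected by one parameter setting."""
    return {
        "S": [("NP", "VP")] + ([("VP",)] if null_subject else []),
        "NP": [("noun",)],
        "VP": [("verb", "NP")] if head_initial else [("NP", "verb")],
    }

def derives(rules, symbol, tokens):
    """Naive membership check: can `symbol` derive exactly `tokens`?"""
    if symbol not in rules:                      # terminal symbol
        return tokens == (symbol,)
    for rhs in rules[symbol]:
        if len(rhs) == 1:                        # unary rule
            if derives(rules, rhs[0], tokens):
                return True
        else:                                    # binary rule: try each split
            left, right = rhs
            for i in range(1, len(tokens)):
                if derives(rules, left, tokens[:i]) and \
                   derives(rules, right, tokens[i:]):
                    return True
    return False

def learn(positive_presentation):
    """Brute-force learner: try all 2^k parameter settings and return
    one consistent with every positive example. The exponential search
    over parameter space is the kind of cost the paper's hardness
    results say cannot, in general, be avoided efficiently."""
    for params in product([False, True], repeat=2):
        rules = productions(*params)
        if all(derives(rules, "S", tuple(s.split()))
               for s in positive_presentation):
            return params
    return None

print(learn(["noun verb noun"]))   # (True, False): head-initial, no null subjects
print(learn(["verb noun"]))        # (True, True): also needs null subjects

With k parameters the learner above checks up to 2^k grammars, which is fine for this two-parameter toy but illustrates why the paper's results matter: they show that for the PPCFGs and PPCSGs no learner can do substantially better from positive presentations unless P = NP (respectively, unless integer factorization is in P).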