The Complexity of Learning Principles and Parameters Grammars

Published 30 Jun 2012 in cs.FL and cs.CL (arXiv:1207.0052v3)

Abstract: We investigate models for learning the classes of context-free and context-sensitive languages (CFLs and CSLs). We begin with a brief discussion of early hardness results showing that unrestricted language learning is impossible and that unrestricted CFL learning is computationally infeasible; we then briefly survey the literature on algorithms for learning restricted subclasses of the CFLs. Finally, we introduce a new family of subclasses, the principled parametric context-free grammars (PPCFGs), together with a corresponding family of principled parametric context-sensitive grammars (PPCSGs), which roughly model the "Principles and Parameters" framework in psycholinguistics. We present three hardness results: first, that the PPCFGs are not efficiently learnable given equivalence and membership oracles; second, that the PPCFGs are not efficiently learnable from positive presentations unless P = NP; and third, that the PPCSGs are not efficiently learnable from positive presentations unless integer factorization is in P.
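
For intuition, the sketch below is a toy illustration (not the paper's formal PPCFG construction) of the general idea behind parametric grammar families: a finite vector of binary parameters indexes a set of context-free grammars, so a learner's task reduces to identifying the parameter setting from example strings. The parameter names, symbols, and grammar rules here are invented for illustration only.

```python
# Toy parametric grammar family, assuming two invented binary parameters:
#   params[0]: 0 -> verb before object (VO), 1 -> object before verb (OV)
#   params[1]: 0 -> no sentence-final particle, 1 -> final particle "ka"
from itertools import product

def grammar_from_parameters(params):
    """Return CFG productions (nonterminal -> list of right-hand sides)
    determined by a tuple of binary parameters."""
    vp = ["v", "o"] if params[0] == 0 else ["o", "v"]
    s = ["s_core"] + (["ka"] if params[1] == 1 else [])
    return {
        "S": [s],
        "s_core": [["subj"] + vp],
    }

def generate(grammar, symbol="S"):
    """Yield every terminal string derivable from `symbol` (finite here)."""
    if symbol not in grammar:          # terminal symbol
        yield [symbol]
        return
    for rhs in grammar[symbol]:
        # expand each symbol of the right-hand side, then concatenate
        expansions = [list(generate(grammar, x)) for x in rhs]
        for combo in product(*expansions):
            yield [tok for part in combo for tok in part]

if __name__ == "__main__":
    # Enumerate the four grammars in this toy family and their languages.
    for params in product([0, 1], repeat=2):
        g = grammar_from_parameters(params)
        print(params, {" ".join(w) for w in generate(g)})
```

In this toy setting a learner sees strings such as "subj o v ka" and must infer the parameter vector (here (1, 1)); the paper's hardness results concern the computational complexity of this kind of inference for its formally defined PPCFG and PPCSG families.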
