ABSTRACT

     In 1985 Terence Langendoen and Paul Postal argued that the output of natural grammars, that is, natural languages, has a degree of infinity larger than that of any set in mathematics.  While their work seemed arcane, and has not been widely taken up since, it addressed a fundamental question that lies at the formal heart of generative grammar: what size of set does a grammar generate? In other words, how “big” is language?

     McCawley (1987) criticized their work on the grounds that an infinitely long sentence is ungrammatical, his objection being a linguistic version of what logic terms the halting problem for a Turing machine.  Langendoen and Postal could bypass McCawley’s criticism by taking refuge in the distinction between a potential and an actual infinite; the nature of generative grammar would remain the same in either case.

     In fact there is a hitherto unrecognized flaw in Langendoen and Postal’s work, which arises when they filter the output of their “machine” (a power-set operation that makes bigger sets from smaller ones; pp. 56-7). The resulting filtered set reverts to a size, or “cardinality,” equal to that of the original set of infinitely long sentences; this cardinality is what mathematics calls “denumerable,” or in set theory “aleph-null.”  Their effort to advance language to unbounded infinite heights thus stalls at the lowest level of infinity, aleph-null, and their hierarchy cannot be extended.
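The cardinality point at issue can be stated explicitly (a sketch of the standard set-theoretic facts being invoked, not notation drawn from the paper itself). Writing $S$ for a denumerable set of sentences:

\[
|S| = \aleph_0, \qquad |\mathcal{P}(S)| = 2^{\aleph_0} > \aleph_0 \quad \text{(Cantor's theorem)}.
\]

The power-set step thus yields a set of strictly larger cardinality. But a filter that admits only those members which are themselves well-formed sentences selects a subset of $S$, so the filtered output $F$ satisfies $F \subseteq S$ and hence $|F| \leq \aleph_0$: the gain in cardinality is lost at the filtering step.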

     A simple and natural model of language use is put forward here: language is used in context to produce acceptable or unacceptable instances of an utterance–context pair.  The context model is an extensional one that uses space-time as a natural base for an abstract space of “speech.”  It can then be shown that the human ability to use language draws on a generative capacity whose output is greater than aleph-null; that is, its cardinality is at least as large as that of the continuum (aleph-1) and probably as large as that of all relationships on the continuum (aleph-2).
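In outline, the counting behind this claim can be sketched as follows (a standard cardinal-arithmetic sketch; the identifications $\aleph_1 = 2^{\aleph_0}$ and $\aleph_2 = 2^{2^{\aleph_0}}$ assume the generalized continuum hypothesis). Let $C$ be the set of contexts, based on space-time and so of continuum size, and $U$ the denumerable set of utterances:

\[
|C| = 2^{\aleph_0}, \qquad |U \times C| = \aleph_0 \cdot 2^{\aleph_0} = 2^{\aleph_0}.
\]

A pattern of use that assigns acceptable or unacceptable to each utterance–context pair is a function $f : U \times C \to \{0,1\}$, and the set of all such functions has cardinality

\[
2^{|U \times C|} = 2^{2^{\aleph_0}},
\]

which under the stated assumption is $\aleph_2$, the cardinality of the set of all relationships on the continuum.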

     These findings form a natural generalization and extension of the Chomsky hierarchy to “super-grammars.”  They also complete Chomsky’s initial demands for generative grammar: an explanation not only for the infinite use of language but also for its spontaneous and creative use, the last two of which have to date been only partially met.

Keywords: Vastness, recursion, generative paradigm, Chomsky hierarchy, performance, set theory, Turing test, Turing machines, degrees of infinity, super-grammars.
