Chomskyan linguistics, beginning with his Syntactic Structures (1957), a distillation of his Logical Structure of Linguistic Theory (written in 1955, published in 1975), challenges structural linguistics and introduces transformational grammar. This theory takes utterances (sequences of words) to have a syntax that can be characterized by a formal grammar; in particular, a context-free grammar extended with transformational rules.
Children are hypothesized to have an innate knowledge of the basic grammatical structure common to all human languages (i.e., they assume that any language they encounter is of a certain restricted kind). This innate knowledge is often referred to as universal grammar. It is argued that modeling knowledge of language using a formal grammar accounts for the "productivity" of language: with a limited set of grammar rules and a finite set of terms, humans are able to produce an infinite number of sentences, including sentences no one has said before. Chomsky has always acknowledged his debt to Pāṇini for his modern notion of an explicit generative grammar. This view is related to rationalist ideas of a priori knowledge, in that such knowledge is innate rather than derived from experience.
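The "productivity" point can be made concrete with a toy context-free grammar: a handful of rules and a small lexicon generate unboundedly many sentences because one rule reintroduces the start symbol. The rule names and vocabulary below are invented for illustration, not drawn from any published grammar.

```python
import random

# A toy context-free grammar: finite rules, finite lexicon, yet the
# recursive option (VP -> V S') yields unboundedly many sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V", "S'"]],   # a VP may embed a whole clause
    "S'": [["Comp", "S"]],              # "that" + sentence -> recursion
}
LEXICON = {
    "Det":  ["the", "a"],
    "N":    ["linguist", "child", "idea"],
    "V":    ["knows", "believes", "sees"],
    "Comp": ["that"],
}

def generate(symbol="S", depth=0, max_depth=3):
    """Expand a symbol into a list of words, capping recursion depth."""
    if symbol in LEXICON:
        return [random.choice(LEXICON[symbol])]
    options = GRAMMAR[symbol]
    if depth >= max_depth:
        # Past the cap, prefer non-recursive expansions so generation halts.
        options = [o for o in options if "S'" not in o] or options
    words = []
    for sym in random.choice(options):
        words.extend(generate(sym, depth + 1, max_depth))
    return words

print(" ".join(generate()))
```

Raising `max_depth` produces ever-longer sentences from the same fixed rule set, which is the sense in which a finite grammar is "productive".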
The Principles and Parameters approach (P&P), developed in his 1979 Pisa lectures and later published as Lectures on Government and Binding (LGB), makes strong claims regarding universal grammar: the grammatical principles underlying languages are innate and fixed, and the differences among the world's languages can be characterized in terms of parameter settings in the brain (such as the pro-drop parameter, which indicates whether an explicit subject is always required, as in English, or can be optionally dropped, as in Spanish), which are often likened to switches. (Hence the term principles and parameters.) In this view, a child learning a language need only acquire the necessary lexical items (words, grammatical morphemes, and idioms) and determine the appropriate parameter settings, which can be done on the basis of a few key examples.
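The "parameters as switches" metaphor can be sketched with a single boolean flag, using the pro-drop parameter mentioned above: the same clause is realized with or without an overt subject depending on the setting. The function and mini-examples are invented for illustration only.

```python
# A toy illustration of a parameter as a binary switch: the pro-drop
# parameter decides whether the overt subject may be omitted.
def realize(subject, verb, pro_drop):
    """Realize a clause; a pro-drop grammar may omit the overt subject."""
    if pro_drop:
        return verb                 # e.g., Spanish "hablo" ("I speak")
    return f"{subject} {verb}"      # e.g., English "I speak"

print(realize("I", "speak", pro_drop=False))   # I speak
print(realize("yo", "hablo", pro_drop=True))   # hablo
```

On this picture, acquiring the difference between English and Spanish amounts to setting one switch, rather than learning a wholly new grammar.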
Proponents of this view argue that the pace at which children learn languages is inexplicably rapid unless children have an innate ability to learn languages. They also point, as evidence for innateness, to the similar steps followed by children all across the world when learning languages, and to the fact that children make certain characteristic errors as they learn their first language, whereas other seemingly logical kinds of errors never occur (and, according to Chomsky, should be attested if a purely general, rather than language-specific, learning mechanism were being employed).
More recently, in his Minimalist Program (1995), Chomsky retains the core concept of principles and parameters but attempts a major overhaul of the linguistic machinery of the LGB model, stripping from it all but the barest necessary elements. He advocates a general approach to the architecture of the human language faculty that emphasizes principles of economy and optimal design, reverting to a derivational approach to generation, in contrast with the largely representational approach of classic P&P.
Chomsky's ideas have had a strong influence on researchers investigating the acquisition of language in children, though some researchers who work in this area today do not support Chomsky's theories, advocating instead emergentist or connectionist theories that reduce language to an instance of general processing mechanisms in the brain.
He also theorizes that unlimited extension of a language such as English is possible only by the recursive device of embedding sentences in sentences.
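This embedding of sentences within sentences can be sketched in a few lines: each level of recursion wraps the previous sentence in a new clause introduced by "that". The frame and vocabulary are invented for illustration.

```python
# A minimal sketch of recursive clausal embedding: applying the same
# rule repeatedly extends the sentence without bound.
def embed(n):
    """Build a sentence with n levels of clausal embedding."""
    sentence = "the cat slept"
    subjects = ["the dog", "the child", "the linguist"]
    for i in range(n):
        subject = subjects[i % len(subjects)]
        sentence = f"{subject} said that {sentence}"
    return sentence

print(embed(0))  # the cat slept
print(embed(1))  # the dog said that the cat slept
```

Since `embed(n + 1)` is always a well-formed sentence whenever `embed(n)` is, the device generates sentences of unbounded length from finite means.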
His best-known work in phonology is The Sound Pattern of English (1968), written with Morris Halle (and often known simply as SPE). This work has been of great significance for the development of the field. While phonological theory has since moved beyond "SPE phonology" in many important respects, the SPE system is considered the precursor of some of the most influential phonological theories today, including autosegmental phonology, lexical phonology, and optimality theory. Chomsky no longer publishes on phonology.
The Chomskyan approach towards syntax, often termed generative grammar, studies grammar as a body of knowledge possessed by language users. Since the 1960s, Chomsky has maintained that much of this knowledge is innate, implying that children need only learn certain parochial features of their native languages. The innate body of linguistic knowledge is often termed Universal Grammar. From Chomsky's perspective, the strongest evidence for the existence of Universal Grammar is simply the fact that children successfully acquire their native languages in so little time. Furthermore, he argues that there is an enormous gap between the linguistic stimuli to which children are exposed and the rich linguistic knowledge which they attain (the "poverty of the stimulus" argument). The knowledge of Universal Grammar would serve to bridge that gap.
Chomsky's theories are popular, particularly in the United States, but they have never been free from controversy, and criticism has come from a number of different directions. Chomskyan linguists rely heavily on the intuitions of native speakers regarding which sentences of their languages are well-formed. This practice has been criticized both on general methodological grounds and because it has (some argue) led to an overemphasis on the study of English. To date, hundreds of different languages have received at least some attention in the generative grammar literature, but some critics nonetheless perceive this overemphasis, along with a tendency to base claims about Universal Grammar on an overly small sample of languages. Some psychologists and psycholinguists, though sympathetic to Chomsky's overall program, have argued that Chomskyan linguists pay insufficient attention to experimental data from language processing, with the consequence that their theories are not psychologically plausible. Other critics (see language learning) have questioned whether it is necessary to posit Universal Grammar in order to explain child language acquisition, arguing that domain-general learning mechanisms are sufficient.
Today there are many different branches of generative grammar; one can view grammatical frameworks such as head-driven phrase structure grammar, lexical functional grammar and combinatory categorial grammar as broadly Chomskyan and generative in orientation, but with significant differences in execution.
Cultural anthropologist and linguist Daniel Everett of Illinois State University has proposed that the language of the Pirahã people of the northwestern rainforest of Brazil is a counterexample to Chomsky's theories of generative grammar. Everett asserts that the Pirahã language shows no evidence of recursion, one of the key properties of generative grammar. Additionally, it is claimed that the Pirahã have no fixed words for colors or numbers, use a very small set of phonemes, and often communicate through prosody alone. Everett's claims have themselves been criticized, however: David Pesetsky of MIT, Andrew Nevins of Harvard, and Cilene Rodrigues of the Universidade Estadual de Campinas in Brazil have argued in a joint paper that all of Everett's major claims contain serious deficiencies. Chomsky himself has commented that "The reports are interesting, but do not bear on the work of mine (along with many others). No one has proposed that languages must have subordinate clauses, number words, etc. Many structures of our language (and presumably that of the Piraha) are rarely if ever used in ordinary speech because of extrinsic constraints." The dispute continues.
Chomsky is famous for investigating various kinds of formal languages and whether or not they might be capable of capturing key properties of human language. His Chomsky hierarchy partitions formal grammars into classes of increasing expressive power, i.e., each successive class can generate a broader set of formal languages than the one before. Notably, Chomsky argues that modeling some aspects of human language requires a more complex formal grammar (as measured by the Chomsky hierarchy) than modeling others. For example, while a regular grammar is powerful enough to model English morphology, it is not powerful enough to model English syntax. In addition to being relevant in linguistics, the Chomsky hierarchy has also become important in computer science (especially in compiler construction and automata theory).
Automata theory: formal languages and formal grammars
Chomsky hierarchy | Grammars                   | Languages                  | Minimal automaton
Type-0            | Unrestricted               | Recursively enumerable     | Turing machine
n/a               | (no common name)           | Recursive                  | Decider
Type-1            | Context-sensitive          | Context-sensitive          | Linear-bounded
n/a               | Indexed                    | Indexed                    | Nested stack
n/a               | Tree-adjoining etc.        | (Mildly context-sensitive) | Embedded pushdown
Type-2            | Context-free               | Context-free               | Nondeterministic pushdown
n/a               | Deterministic context-free | Deterministic context-free | Deterministic pushdown
Type-3            | Regular                    | Regular                    | Finite
n/a               | n/a                        | Star-free                  | Aperiodic finite
Each category of languages or grammars is a proper subset of the category directly above it.
Any automaton in each category has an equivalent automaton in the category directly above it.
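The expressive gap between Type-3 (regular) and Type-2 (context-free) can be illustrated with the classic language aⁿbⁿ: a finite automaton (here, a regular expression) can recognize the regular language (ab)*, but recognizing aⁿbⁿ requires matched counting, which a finite automaton cannot do and which a single counter, standing in for a pushdown stack, can. A minimal sketch:

```python
import re

# Type-3: (ab)* is regular, so a regular expression (equivalently, a
# finite automaton) recognizes it.
def is_ab_star(s):
    return re.fullmatch(r"(ab)*", s) is not None

# Type-2: a^n b^n needs matched counting; a single counter plays the
# role of the pushdown stack here.
def is_anbn(s):
    count = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # an "a" after a "b" is ill-formed
                return False
            count += 1
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:       # more b's than a's so far
                return False
        else:
            return False
    return count == 0           # every "a" matched by a "b"

print(is_ab_star("ababab"))  # True: regular pattern suffices
print(is_anbn("aaabbb"))     # True: counter matches 3 with 3
print(is_anbn("aabbb"))      # False: counts differ
```

No regular expression (without counting extensions) can recognize aⁿbⁿ for arbitrary n, which mirrors the claim above that regular grammars fall short of phenomena like nested English clause structure.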