Generative grammar is a theoretical approach in linguistics that regards grammar as a domain-specific system of rules that generates all and only the grammatical sentences of a given language. In light of poverty of the stimulus arguments, grammar is regarded as being partly innate, the innate portion of the system being referred to as universal grammar. The generative approach has focused on the study of syntax while addressing other aspects of language including semantics, morphology, phonology, and psycholinguistics.[1][2]

A generative syntax tree in which the sentence S breaks down into a noun phrase NP and a verb phrase VP, both of which break down into additional smaller constituents.

As a research tradition, generative grammar began in the late 1950s with the work of Noam Chomsky.[3] However, its roots include earlier structuralist approaches such as glossematics.[4] Early versions of Chomsky's approach to syntax were called transformational grammar, with subsequent variants known as the government and binding theory and the minimalist program.[5][6] Recent work in generative-inspired biolinguistics has proposed that universal grammar consists solely of syntactic recursion, and that it arose recently in humans as the result of a random genetic mutation.[7]

Frameworks

There are a number of different approaches to generative grammar. Common to all is the effort to come up with a set of rules or principles that formally defines each of the members of the set of well-formed expressions of a natural language. The term generative grammar has been associated with a number of different schools of linguistics.

Historical development of models of transformational grammar

Leonard Bloomfield, an influential linguist in the American Structuralist tradition, saw the ancient Indian grammarian Pāṇini as an antecedent of structuralism.[8][9] However, in Aspects of the Theory of Syntax, Chomsky writes that "even Panini's grammar can be interpreted as a fragment of such a 'generative grammar'",[10] a view that he reiterated in an award acceptance speech delivered in India in 2001, where he claimed that "the first 'generative grammar' in something like the modern sense is Panini's grammar of Sanskrit".[11]

Military funding for generativist research contributed to its early success in the 1960s.[12]

Generative grammar has been under development since the mid-1950s, and has undergone many changes in the types of rules and representations that are used to predict grammaticality. In tracing the historical development of ideas within generative grammar, it is useful to refer to the various stages in the development of the theory:

Standard theory (1956–1965)

The so-called standard theory corresponds to the original model of generative grammar laid out by Chomsky in 1965.

A core aspect of standard theory is the distinction between two different representations of a sentence, called deep structure and surface structure. The two representations are linked to each other by a set of transformations.
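As a toy illustration (added here; the article contains no code, and this is not the actual rule format of the standard theory), the following Python sketch represents a deep structure as a nested list and applies a simplified subject-auxiliary inversion, a classic example of a transformation, to derive the surface structure of a yes/no question:

    # Toy sketch only: a deep structure as a nested list [label, child, ...],
    # and a simplified subject-auxiliary inversion transformation that maps it
    # to the surface structure of a yes/no question.
    def invert_subject_aux(deep):
        """Front the auxiliary of the verb phrase, leaving the subject in place."""
        _s, np, vp = deep                # expect ['S', NP, VP]
        aux, *rest = vp[1:]              # expect ['VP', Aux, ...rest of the VP]
        return ['S', aux, np, ['VP', *rest]]

    deep_structure = ['S',
                      ['NP', 'the', 'dog'],
                      ['VP', 'will', 'eat', ['NP', 'the', 'bone']]]

    print(invert_subject_aux(deep_structure))
    # ['S', 'will', ['NP', 'the', 'dog'], ['VP', 'eat', ['NP', 'the', 'bone']]]

The transformations of the standard theory were stated over full phrase markers and were considerably more constrained than this sketch suggests.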

Extended standard theory (1965–1973)

The so-called extended standard theory was formulated in the late 1960s and early 1970s. Its features include:

  • syntactic constraints
  • generalized phrase structures (X-bar theory)

Revised extended standard theory (1973–1976)

The so-called revised extended standard theory was formulated between 1973 and 1976.

Relational grammar (ca. 1975–1990)

Relational grammar is an alternative model of syntax based on the idea that grammatical relations such as subject, direct object, and indirect object play a primary role in grammar.

Government and binding/principles and parameters theory (1981–1990)

This framework was developed in Chomsky's Lectures on Government and Binding (1981) and Barriers (1986).

Minimalist program (1990–present)

The minimalist program is a line of inquiry that hypothesizes that the human language faculty is optimal, containing only what is necessary to meet humans' physical and communicative needs, and seeks to identify the necessary properties of such a system. It was proposed by Chomsky in 1993.[13]

Context-free grammars

Generative grammars can be described and compared with the aid of the Chomsky hierarchy (proposed by Chomsky in the 1950s), which sets out a series of types of formal grammar with increasing expressive power. Among the simplest types are the regular grammars (type 3); Chomsky argues that these are not adequate as models for human language because all natural human languages allow the center-embedding of strings within strings, which regular grammars cannot capture.
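A standard formal illustration of this point (a sketch added here, not taken from the article) is the language of matched strings aⁿbⁿ, which mimics unbounded center-embedding: recognizing it requires keeping track of how many a's are still open, which no finite-state (regular) device can do for arbitrary n, whereas the single context-free rule S → a S b | ε handles it directly. A minimal Python recognizer:

    # Sketch: the context-free rule S -> "a" S "b" | ε generates aⁿbⁿ,
    # a stand-in for unbounded center-embedding. The recursive recognizer
    # implicitly uses a stack (the call stack), which is exactly the extra
    # power that context-free grammars have over regular grammars.
    def is_anbn(s: str) -> bool:
        if s == "":
            return True                  # S -> ε
        if s.startswith("a") and s.endswith("b"):
            return is_anbn(s[1:-1])      # S -> a S b
        return False

    for example in ["", "ab", "aabb", "aab", "abab"]:
        print(repr(example), is_anbn(example))
    # '' True, 'ab' True, 'aabb' True, 'aab' False, 'abab' False

Doubly center-embedded relative clauses ("the dog that the cat that the rat bit chased ran") follow the same nesting pattern.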

At a higher level of complexity are the context-free grammars (type 2). The derivation of a sentence by such a grammar can be depicted as a derivation tree. Linguists working within generative grammar often view such trees as a primary object of study. According to this view, a sentence is not merely a string of words. Instead, adjacent words are combined into constituents, which can then be further combined with other words or constituents to create a hierarchical tree-structure.

The derivation of a simple tree-structure for the sentence "the dog ate the bone" proceeds as follows. The determiner the and noun dog combine to create the noun phrase the dog. A second noun phrase the bone is created with determiner the and noun bone. The verb ate combines with the second noun phrase, the bone, to create the verb phrase ate the bone. Finally, the first noun phrase, the dog, combines with the verb phrase, ate the bone, to complete the sentence: the dog ate the bone. The following tree diagram illustrates this derivation and the resulting structure:

    S
    ├── NP
    │   ├── D  the
    │   └── N  dog
    └── VP
        ├── V  ate
        └── NP
            ├── D  the
            └── N  bone

Such a tree diagram is also called a phrase marker. Phrase markers can be represented more compactly in text form, though the result is less easy to read; in this format, the above sentence would be rendered as:
[S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]
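To make the correspondence between the tree and the bracketed notation concrete, the following Python sketch (an illustration added here; the nested-tuple representation is an assumption for the example, not a standard formalism) builds the phrase marker for "the dog ate the bone" and prints its labelled bracketing:

    # Sketch: a phrase marker represented as a nested (label, child, ...) tuple,
    # plus a function that renders it in labelled-bracket notation.
    def brackets(node):
        if isinstance(node, str):        # a terminal word
            return node
        label, *children = node
        inner = " ".join(brackets(child) for child in children)
        return f"[{label} {inner} ]"

    tree = ("S",
            ("NP", ("D", "The"), ("N", "dog")),
            ("VP", ("V", "ate"),
                   ("NP", ("D", "the"), ("N", "bone"))))

    print(brackets(tree))
    # [S [NP [D The ] [N dog ] ] [VP [V ate ] [NP [D the ] [N bone ] ] ] ]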

Chomsky has argued that phrase structure grammars are also inadequate for describing natural languages, and formulated the more complex system of transformational grammar.[14]

Evidentiality

Some linguists, such as Geoffrey Pullum, have questioned the empirical basis of poverty of the stimulus arguments, which motivate the crucial generative notion of universal grammar.[15] Linguistic studies have sought to show that children possess innate knowledge of grammar which they could not have learned from their input. For example, it was argued that a child acquiring English can distinguish the position of the verb in main clauses from its position in relative clauses. In one experiment, children were asked to turn a declarative sentence containing a relative clause into an interrogative sentence. The children did not move the verb of the relative clause to the sentence-initial position, but rather the verb of the main clause, as is grammatical.[16] Critics, however, pointed out that this was not evidence for the poverty of the stimulus, because the structures that the children were shown to manipulate are in fact highly common in children's literature and everyday speech.[15] This led to a heated debate, which resulted in a growing split between generative linguistics and applied linguistics in the early 2000s.[17][18]

 
The sentence from the study, showing that it is the verb of the main clause, not the verb of the relative clause, that raises to the head C°.[19]

More recently, it has been argued that the success of large language models undermines key claims of generative syntax, because such models are built on markedly different assumptions, including gradient probability and memorized constructions, and outperform generative theories both in modelling syntactic structure and in their integration with cognition and neuroscience.[20]

Generative-inspired biolinguistics has not uncovered any particular genes responsible for language. While some hopes were raised at the discovery of the FOXP2 gene,[21][22] there is not enough support for the idea that it is 'the grammar gene' or that it had much to do with the relatively recent emergence of syntactical speech.[23]

Generativists also claim that language is situated within its own module of the mind and that there is no interaction between first-language processing and other types of information processing, such as mathematics.[24][a] This claim is not based on research or on a general scientific understanding of how the brain works.[25][26]

Music

Generative grammar has been used in music theory and analysis since the 1980s.[27][28] The best-known approaches were developed by Mark Steedman[29] and by Fred Lerdahl and Ray Jackendoff,[30] who formalized and extended ideas from Schenkerian analysis.[31] More recently, these early generative approaches to music have been further developed and extended by various scholars.[32][33][34][35][36]

See also

Notes

  1. ^ Smith 2002, p. 17 "the mind itself is not an undifferentiated general-purpose machine: it is compartmentalized in such a way that different tasks are subserved by different mechanisms. The mind is "modular". Sight and smell, taste and touch, language and memory, are all distinct from each other, from our moral and social judgment, and from our expertise in music or mathematics."

References

  1. ^ Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Ress-Miller, Janie (eds.). The Handbook of Linguistics. Wiley Blackwell.
  2. ^ Carnie, Andrew (2002). Syntax: A Generative Introduction. Wiley-Blackwell. pp. 3–20. ISBN 978-0-631-22543-0.
  3. ^ "Tool Module: Chomsky's Universal Grammar". thebrain.mcgill.ca. Retrieved 2017-08-28.
  4. ^ Koerner, E. F. K. (1978). "Towards a historiography of linguistics". Toward a Historiography of Linguistics: Selected Essays. John Benjamins. pp. 21–54.
  5. ^ "Tool Module: Chomsky's Universal Grammar". thebrain.mcgill.ca. Retrieved 2017-08-28.
  6. ^ "Mod 4 Lesson 4.2.3 Generative-Transformational Grammar Theory". www2.leeward.hawaii.edu. Retrieved 2017-02-02.
  7. ^ Berwick, Robert C.; Chomsky, Noam (2015). Why Only Us: Language and Evolution. MIT Press. ISBN 9780262034241.
  8. ^ Bloomfield, Leonard, 1929, 274; cited in Rogers, David, 1987, 88
  9. ^ Hockett, Charles, 1987, 41
  10. ^ Chomsky, Noam (2015). Aspects of the theory of syntax. The MIT Press. pp. v. ISBN 978-0-262-52740-8. OCLC 1055331632.
  11. ^ "Understanding human language". frontline.thehindu.com. 7 December 2001. Retrieved 24 July 2022.
  12. ^ Newmeyer, F. J. (1986). Has there been a 'Chomskyan revolution' in linguistics?. Language, 62(1), p.13
  13. ^ Chomsky, Noam. 1993. A minimalist program for linguistic theory. MIT occasional papers in linguistics no. 1. Cambridge, Massachusetts: Distributed by MIT Working Papers in Linguistics.
  14. ^ Chomsky, Noam (1956). "Three models for the description of language" (PDF). IRE Transactions on Information Theory. 2 (3): 113–124. doi:10.1109/TIT.1956.1056813. S2CID 19519474. Archived from the original (PDF) on 2010-09-19.
  15. ^ a b Pullum, GK; Scholz, BC (2002). "Empirical assessment of stimulus poverty arguments" (PDF). The Linguistic Review. 18 (1–2): 9–50. doi:10.1515/tlir.19.1-2.9. S2CID 143735248. Retrieved 2020-02-28.
  16. ^ Pinker, Steven (2007). The language instinct: The new science of language and mind. Harper Perennial Modern Classics. ISBN 9780061336461.
  17. ^ Fernald, Anne; Marchman, Virginia A. (2006). "27: Language learning in infancy". In Traxler and Gernsbacher (ed.). Handbook of Psycholinguistics. Academic Press. pp. 1027–1071. ISBN 9780080466415.
  18. ^ a b de Bot, Kees (2015). A History of Applied Linguistics: From 1980 to the Present. Routledge. ISBN 9781138820654.
  19. ^ Christensen, Christian Hejlesen. "Arguments for and against the Idea of Universal Grammar". Leviathan: Interdisciplinary Journal in English, 2018: 12–28.
  20. ^ Piantadosi, S (2023). "Modern Language Models Refute Chomsky's Approach to Language". Lingbuzz. Retrieved 2023-03-15.
  21. ^ Scharff C, Haesler S (December 2005). "An evolutionary perspective on FoxP2: strictly for the birds?". Curr. Opin. Neurobiol. 15 (6): 694–703. doi:10.1016/j.conb.2005.10.004. PMID 16266802. S2CID 11350165.
  22. ^ Scharff C, Petri J (July 2011). "Evo-devo, deep homology and FoxP2: implications for the evolution of speech and language". Philos. Trans. R. Soc. Lond. B Biol. Sci. 366 (1574): 2124–40. doi:10.1098/rstb.2011.0001. PMC 3130369. PMID 21690130.
  23. ^ Diller, Karl C.; Cann, Rebecca L. (2009). Rudolf Botha; Chris Knight (eds.). Evidence Against a Genetic-Based Revolution in Language 50,000 Years Ago. Oxford Series in the Evolution of Language. Oxford.: Oxford University Press. pp. 135–149. ISBN 978-0-19-954586-5. OCLC 804498749. {{cite book}}: |work= ignored (help)
  24. ^ Smith, Neil (2002). Chomsky: Ideas and Ideals (2nd ed.). Cambridge University Press. ISBN 0-521-47517-1.
  25. ^ Schwarz-Friesel, Monika (2012). "On the status of external evidence in the theories of cognitive linguistics". Language Sciences. 34 (6): 656–664. doi:10.1016/j.langsci.2012.04.007.
  26. ^ Elsabbagh, Mayada; Karmiloff-Smith, Annette (2005). "Modularity of mind and language". In Brown, Keith (ed.). Encyclopedia of Language and Linguistics (PDF). Elsevier. ISBN 9780080547848. Retrieved 2020-03-05.
  27. ^ Baroni, M., Maguire, S., and Drabkin, W. (1983). The Concept of Musical Grammar. Music Analysis, 2:175–208.
  28. ^ Baroni, M. and Callegari, L. (1982) Eds., Musical grammars and computer analysis. Leo S. Olschki Editore: Firenze, 201–218.
  29. ^ Steedman, M.J. (1989). "A Generative Grammar for Jazz Chord Sequences". Music Perception. 2 (1): 52–77. doi:10.2307/40285282. JSTOR 40285282.
  30. ^ Lerdahl, Fred; Ray Jackendoff (1996). A Generative Theory of Tonal Music. Cambridge: MIT Press. ISBN 978-0-262-62107-6.
  31. ^ Heinrich Schenker, Free Composition. (Der Freie Satz) translated and edited by Ernst Ostler. New York: Longman, 1979.
  32. ^ Tangian, Andranik (1999). "Towards a generative theory of interpretation for performance modeling". Musicae Scientiae. 3 (2): 237–267. doi:10.1177/102986499900300205. S2CID 145716284.
  33. ^ Tojo, O. Y. & Nishida, M. (2006). Analysis of chord progression by HPSG. In Proceedings of the 24th IASTED international conference on Artificial intelligence and applications, 305–310.
  34. ^ Rohrmeier, Martin (2007). A generative grammar approach to diatonic harmonic structure. In Spyridis, Georgaki, Kouroupetroglou, Anagnostopoulou (Eds.), Proceedings of the 4th Sound and Music Computing Conference, 97–100. http://smc07.uoa.gr/SMC07%20Proceedings/SMC07%20Paper%2015.pdf
  35. ^ Giblin, Iain (2008). Music and the generative enterprise. Doctoral dissertation. University of New South Wales.
  36. ^ Katz, Jonah; David Pesetsky (2009) "The Identity Thesis for Language and Music". http://ling.auf.net/lingBuzz/000959

Further reading