Talk:Transformational grammar

From Wikipedia, the free encyclopedia
OK, I've added a brief description of what a transformational grammar is, and a history of the development of these grammars from a Chomskyan point of view. It probably concentrates too much on Chomsky and DS/SS/LF/PF, but this just happens to be what I know something about. This really needs to be merged with the Transformational-generative grammar page. -- Cadr

I've merged the article. The TGG page was really just a bunch of transformations, so I tagged it on to the end of this one. Dduck 17:52, 16 Nov 2003 (UTC)


Thanks :) -- Cadr
I'm thinking about what to do with the "transformations" section. It's pretty good as it is, but it's rather out of sync with current thinking. On the other hand it should give simple examples, and it might be hard to bring it in line with more current ideas in syntax without complicating it, so I dunno... My main problem with it is that it talks about particular rules (e.g. question-forming rules) in the kind of way Chomsky would have talked about them 30-40 years ago. Now there are no construction-specific rules, so it's a little misleading. Anyone have any ideas? -- Cadr

The article says Chomsky argued that the intuition of a native speaker is enough to define the grammaticalness of a sentence; that is, if a native English speaker finds it difficult or impossible to understand a particular string of English words, it can be said that the string of words is ungrammatical [1]. ... [1] This is not entirely true; it is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example "colourless green ideas sleep furiously". Such sentences are nonsensical in a very different way to (non-)sentences like "man the bit dog the"

Not only is the footnote right that the text in the body is not entirely true, the text in the body is actually wrong. Sensicality and grammaticality are nearly completely separate. The footnote gives examples of both a nonsensical but grammatical sentence and a (more or less) sensical but ungrammatical sentence. Both types of example are incredibly easy to come up with in scores, because these attributes are not related.

It is the intuition of a native speaker which defines grammaticality, but it is their intuition about grammaticality, not about meaning.

But I was reluctant to edit the body of the text because someone else had already been reluctant to correct it (hence the footnote), plus I've been out of the Wikipedia loop for a while.

So if someone would like to a) refute my claim that the article is wrong, b) fix the article themselves, or c) suggest whether I should correct the body of the article or just expand the footnote, I would appreciate it.

Aidan Elliott-McCrea 17:03, 26 Feb 2004 (UTC)
In fact I wrote the text and the footnote -- I agree that the text is misleading, but I was trying to simplify (hence the correcting footnote). But I take your point; please edit it as you see fit :) -- Cadr
Done. :Aidan

[This article concentrates heavily on Chomsky and Chomsky-related aspects of this topic. This is justifiable to some degree considering his importance in the field, but it would be nice to have a more balanced view.]

I removed the above from the text, because it's an editorial note, and even if true, belongs on the talk page. DanKeshet 23:43, May 10, 2004 (UTC)

Transformational Grammar

According to what I've read above, some maintainers of this page think what I'm about to suggest would complicate the issue, but the article says "the mechanisms described in the example above have been out of date since the late 1960s", and I would really like to know what the current theory is to explain the transformation from "He went there" to "Where did he go?" Maybe this query belongs here.

The link at the end of the page gives a good introduction to (fairly) modern transformational theory. As can be seen by the length of it, it's not really feasible to go into the detail of the theory in an encyclopaedia article. Cadr
I would like to give a detailed answer to this eventually, but for the moment I don't have a lot of time. I refer you to the article Lexical Functional Grammar, one of the "current" theories. The essential thing here is that current syntactic theory rejects the idea that "deep structure" is a tree-structured sentence. For instance, while Chomskyan syntax maps "Where did he go" to "He did go where?", LFG maps "Where did he go" to an attribute-value matrix. arj 20:42, 16 May 2004 (UTC)

LFG is nontransformational (as you explain), so it isn't really an example of current transformational theory. But I do like LFG — just being pedantic ;) The current(ish) transformational analysis of questions isn't actually all that different from the one given in the article. You start with:

[CP [Spec 0] [C 0] [IP John [I did] [VP hit [DP who]]]] (partial structure only)

Then move I to C (subject-auxiliary inversion):

[CP [Spec 0] [C did] [IP John t [VP hit [DP who]]]]

Then move "who" to the front of the sentence (Spec-CP):

[CP [Spec [DP who]] [C did] [IP John t [VP hit t]]]

(I've used '0' to represent an empty node in the tree.) Subject-auxiliary inversion is justified by saying that C has a +Q (question) feature which needs to be checked by the dummy auxiliary "did" (after all, you wouldn't have that auxiliary in a non-question sentence, so it must be doing something). Movement of "who" is harder to explain. Basically, it gives you a representation with a quantifier and a variable, like in logic:

for which X, John hit X
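As a concrete illustration of the derivation (a sketch only, not any published implementation), the two movements can be expressed in a few lines of Python on a nested-list stand-in for the tree, with 't' as an improvised trace marker:

    # Start: [CP [Spec 0] [C 0] [IP John [I did] [VP hit [DP who]]]]
    tree = ["CP",
            ["Spec", "0"],
            ["C", "0"],
            ["IP", "John",
                ["I", "did"],
                ["VP", "hit", ["DP", "who"]]]]

    def move_i_to_c(cp):
        """Subject-auxiliary inversion: move the content of I into C, leaving a trace."""
        c, ip = cp[2], cp[3]
        i_node = ip[2]            # the [I did] node
        c[1] = i_node[1]          # C now hosts "did"
        i_node[1] = "t"           # trace left in I
        return cp

    def move_wh_to_spec(cp):
        """Wh-movement: move the wh-phrase to Spec-CP, leaving a trace in object position."""
        spec, ip = cp[1], cp[3]
        vp = ip[3]
        spec[1] = vp[2]           # Spec-CP now hosts [DP who]
        vp[2] = "t"               # trace left behind
        return cp

    move_i_to_c(tree)
    move_wh_to_spec(tree)
    print(tree)
    # ['CP', ['Spec', ['DP', 'who']], ['C', 'did'],
    #  ['IP', 'John', ['I', 't'], ['VP', 'hit', 't']]]

The order mirrors the prose: I-to-C applies first, then the wh-phrase fronts to Spec-CP.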

Cadr


Hi All,

I wasn't really happy at all with the section on transformations; although the earlier version was fairly straightforward, it contained some pretty big inaccuracies. For example, it listed headedness parameters as examples of transformations. These aren't transformations at all. Transformations as Chomsky designed them were structure-changing and structure-building operations, not settings of parameters. Also, the example of wh-movement was never proposed in that particular formulation. I'm afraid I've made the section a little harder to read and a little more technical, but much more accurate. Further examples would probably make it clearer.

AndrewCarnie


Hi all, I am not sure that the last discussion on this page (Revision as of 18:33, 24 Jun 2004 137.194.204.100: a discussion of Vygotsky's work in the middle of the section on minimalism) belongs where it has been put. Does anybody else agree with reverting to the previous version? I think this may have been a confusion on the part of the contributor (I've never heard of such a direct link between ("Chomskyan") minimalism and Vygotsky before) and don't think that this link should be present in this section on minimalism. AnandaLima 04:35, 1 Jul 2004 (UTC)

Hi, For clarity at the beginning of this article (and in keeping with the general uses of Wikipedia), I think a simpler, more straightforward definition is required. This sentence from a lower section of the article would do, with minor alterations: "One of the most important of Chomsky's ideas is that most of this knowledge is innate, with the result that a baby can have a large body of prior knowledge about the structure of language in general, and need only actually learn the idiosyncratic features of the language(s) it is exposed to."

Additionally, there are short and straightforward definitions/discussion to be found in textbooks, such as How English Works: A Linguistic Introduction.

In general, making this discussion even more exhaustive defeats the purpose of Wikipedia as a beginning source or gateway to other research. I realize it is an ill-defined genre still, but its uses should be kept in mind as the genre continues to define itself. Therefore, the kind of language appropriate to textbooks is also appropriate to Wikipedia, i.e., introductory language aimed at novices, particularly in the earliest section of the articles.

(Non)Context-freeness of natural language

I strongly disagree with this sentence:

It is now generally accepted that it is impossible to describe the structure of natural languages using context free grammars (at least if these descriptions are to be judged on vaguely Chomskyan criteria).

Can anyone point me to sources for this extraordinary claim? Thanks, Burschik 11:55, 17 Aug 2004 (UTC)

It's certainly not extraordinary, it's just an element of Chomskyan orthodoxy which has gone relatively unchallenged. Even advocates of GPSG only (?) succeeded in using CFGs to describe natural languages by using metagrammars, complex feature systems and sophisticated semantic rules, i.e. extensions to basic context-free grammar. The empirical argument supporting the claim is very simple. Natural languages allow unbounded dependencies (e.g. in the sentence "Which man whose brother John used to go to school with likes muffins?", where the verb "likes" must agree in number with the phrase "which man") and CFGs (quite uncontroversially) cannot by themselves deal with unbounded dependencies.
In point of fact, a CFG can handle an arbitrary number of dependencies of that sort. It's dependencies of the sort R1 S1 R2 S2 that a CFG can't handle, and in natural language, all similar examples dealing with semantic references that I'm familiar with are ambiguous. However, operator movement might alter this principle. Dhasenan 19:24, 21 May 2006 (UTC)
It depends what you mean. A CFG can't handle long-distance dependencies and agreement at the same time in a linguistically well-motivated way (because you have to have separate rules establishing verb/subject agreement for sentences with and without wh-movement). In order to get rid of this sort of redundancy, you need to use metarules (a la GPSG), but then you no longer have a CFG, just something with the same weak generative capacity. (You can expand the metarules out and get a CFG, of course, but the resulting CFG won't be linguistically plausible for the reason just given.)
Anyway, it's now completely uncontroversial that CFGs aren't adequate, because people have found natural language constructions which can't even be weakly generated by CFGs. (I will add a reference to this effect shortly). Cadr 01:53, 22 May 2006 (UTC)
The non-context-freeness of NL is now quite uncontroversial, although I'm not convinced that a recursively-enumerable grammar is required (which TG is). Despite some small disagreements by Manaster Ramer (1988), the Swiss German arguments for the non-context-freeness of NL by R. Huybregts (1984) and Shieber (1985) have largely gone uncontested. Further papers by Culy (1985) regarding Bambara and Phillip Miller (1991) regarding Norwegian and Swedish have also given credence to the argument. In spite of these few non-context-free examples (which nevertheless are very important), I like the quote by Gazdar and Pullum (1985) on the matter: "the overwhelming majority of the structures of any NL can be elegantly and efficiently parsed using context-free parsing technologies." The whole topic of the relation between formal languages and natural languages is fascinating and I think deserves much more treatment by linguists than is currently given. --jonsafari 05:20, 22 May 2006 (UTC)
Although traditional TG is, indeed, re (as pointed out above), it's pretty universally accepted now that those features of natural language which are non-context-free (i.e. cross-serial dependencies) only require something which is a bit weaker than a re grammar, namely a "mildly context-sensitive" grammar. Interestingly, it turns out that nearly all contemporary linguistic formalisms (Minimalism (formalized à la Ed Stabler), Categorial Grammar, Tree-Adjoining Grammar, the various PSGs) are mildly context-sensitive. Diakronik (talk) 01:56, 16 March 2009 (UTC)
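To make the formal point concrete: the cross-serial (Swiss German) constructions mentioned above have roughly the skeleton of the string pattern a^m b^n c^m d^n, where the a's must match the c's and the b's must match the d's, so the two matchings cross each other; a context-free grammar can match nested pairs, but not crossing ones. Below is a small Python sketch (illustrative only, not from any of the cited papers) of a recognizer for that pattern; the two independent length checks are exactly what a single pushdown stack cannot interleave:

    import re

    def is_cross_serial(s):
        """Recognize a^m b^n c^m d^n (m, n >= 1): the count of a's must equal
        the count of c's, and the count of b's must equal the count of d's."""
        match = re.fullmatch(r"(a+)(b+)(c+)(d+)", s)
        return bool(match) and (
            len(match.group(1)) == len(match.group(3))
            and len(match.group(2)) == len(match.group(4))
        )

    print(is_cross_serial("aabbbccddd"))  # True: 2 a's = 2 c's, 3 b's = 3 d's
    print(is_cross_serial("aabcccd"))     # False: 2 a's vs 3 c's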
Having said this, the efforts of GPSG and related theories probably mean that we should weaken the statement a little, perhaps to "widely agreed"? Cadr 18:22, 17 Aug 2004 (UTC)
You could go back to Syntactic Structures or Aspects of the Theory of Syntax. I don't really understand how this can be controversial. Here is a simple example (which Chomsky used in a class he taught last fall at the University of Arizona): "Mary looked at the man with the telescope." This sentence has two possible parses: Mary used a telescope to look at the man, or Mary looked at the man who had a telescope. Clearly you can't distinguish which parse is correct simply from the sentence; you need the context of the discourse in which the sentence occurred. So for example: "Mary just got a new telescope. Mary looked at the man with the telescope" or "Mary thought that the guy with the telescope was weird. Mary looked at the man with the telescope." --MadScientistX11 (talk) 22:46, 7 February 2019 (UTC)

Wikibook about transformational grammar

Hello, I am a Spanish Wikipedian who has begun a stub on Wikibooks about the rules governing language according to transformational grammar. All of you are welcome to participate in it. Thank you :) --Javier Carro 11:18, 1 Jan 2005 (UTC)

Citing sources

I've moved this tag from the article: {{unreferenced}} The article could do with more citations, but I think tags like this belong on the talk page rather than article space unless there is a good reason, and I don't think there's a good enough reason here. Enchanter 23:08, 17 January 2006 (UTC)

I don't see any references cited for the article. Unless you intend to start providing references very soon, I intend to move the tag back to the article page. Placing the tag on the talk page is, in my opinion, hiding it away so no one will notice it. -- Dalbury(Talk) 23:30, 17 January 2006 (UTC)
Wikipedia articles are there for the benefit of readers; the talk page is there for contributors to discuss what should be in the article. That's why Wikipedia generally avoids putting comment, discussion and tags in articles unless necessary. I can't see any benefit to the reader of the article of this tag; for them, it is a statement of the obvious (it's clear to anyone reading what is and isn't cited). We shouldn't be writing "adverts" in the articles to get more contributors - that's just not appropriate in an encyclopedia article. If we applied tags to every article that could do with some kind of improvement, most of our articles would be covered with them. Also, just using a tag doesn't really help contributors much either; for example, it's not clear what specific aspects of the article you would like to see cited. That's why this kind of material belongs on the talk page unless there is a good reason why the comment in the article would benefit readers. Enchanter 23:46, 17 January 2006 (UTC)
We place all kinds of tags on article pages: {{Unreferencedsect}} (specifically intended to go in a section on an article page), {{afd}}, {{ActiveDiscuss}}, {{Contradict}}, {{Contradict-other}}, {{Controversial}}, {{Disputed}}, {{Not verified}}, {{dubious}}, {{Hoax}}, {{POV}}, {{copyvio}}, {{expert}}, and many others, all of which serve as warnings to readers that they cannot necessarily rely on the article as a source of information. I think that {{unreferenced}} is quite mild compared to some of those. We need to be upfront about the deficiencies of articles. And putting those notices on the article pages is a goad to improve them. As for what needs to be sourced, everything in an article should have a source cited at some level. Wikipedia:Verifiability is quite clear that everything that goes into an article must be verifiable, so editors should cite credible sources so that their edits can be verified by readers and other editors. If the article does not cite credible sources, then the article lacks credibility. If the articles are not credible, then Wikipedia is not credible. I'm reluctant to do this myself. I have a very long list of things I want to do in Wikipedia, and it has been 30 years since I've studied transformational grammar. If someone whose acquaintance with TG is more recent than mine wants to work on supplying references, I'll try to help. I do have a number of references in the house, but, as I said, I haven't opened them in 30 years or more. (Gah, I've just talked myself into another commitment!) -- Dalbury(Talk) 00:33, 18 January 2006 (UTC)
The other option would be to move the uncited material to the talk page. - FrancisTyers 00:39, 18 January 2006 (UTC)
That would be the whole article, right now. I just want to see some references cited. -- Dalbury(Talk) 02:09, 18 January 2006 (UTC)

Does this qualify as an example of a speaker with a transformational grammar?

Some researchers have found a parrot with a 950-word vocabulary that can generate new words. --Ancheta Wis 03:12, 29 December 2006 (UTC)

Not enough data in that article to say. Claims such as this tend to become very thin and trivial when examined closely. -- Donald Albury 20:01, 29 December 2006 (UTC)

Split infinitive

The article on the split infinitive has a stub section claiming that there is a transformational grammar approach to understanding this construction. Could someone who has worked on this aspect of linguistics please go there and write a paragraph explaining this? Thanks, --Doric Loon 11:51, 27 August 2007 (UTC)

The central question

What is transformational grammar? The answer to this central question does not lie in the details of its properties (although fine that you include them to provide a more detailed understanding). What problem does it attempt to solve, or issue does it attempt to address? What is it for? What good is it in practice? What is transformational grammar? —Preceding unsigned comment added by Rogerfgay (talkcontribs) 15:34, 16 October 2007 (UTC)

Transformational grammar is a context-free grammar plus rules that allow you to take one parse tree and form another according to well-defined "pivoting rules". These rules preserve the meaning of the sentence, but change the form. According to Chomsky, natural language should be defined by both its grammar and its transformations. Likebox (talk) 04:29, 12 January 2010 (UTC)
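As a rough illustration of that definition (a sketch with made-up category labels, not Chomsky's actual formalism), here is a tiny context-free base plus one meaning-preserving "pivoting" rule, in Python, that turns a declarative tree into a yes/no question:

    # Context-free base rule: S -> NP Aux VP, giving the deep structure below.
    deep = ["S", ["NP", "John"], ["Aux", "can"], ["VP", "swim"]]

    def question_transformation(s):
        """Pivoting rule: S -> NP Aux VP becomes Aux NP VP.
        The tree changes shape, but its parts (and the meaning) are preserved."""
        label, np, aux, vp = s
        return ["Sq", aux, np, vp]

    def words(tree):
        """Read the terminal words off a labelled tree, left to right."""
        if isinstance(tree, str):
            return [tree]
        out = []
        for child in tree[1:]:        # tree[0] is the category label
            out.extend(words(child))
        return out

    print(" ".join(words(deep)))                                  # John can swim
    print(" ".join(words(question_transformation(deep))) + "?")   # can John swim?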

Grammatical Obtuseness Needing Transformation

This is the lead sentence in the article: "In linguistics, a transformational grammar or transformational-generative grammar (TGG) is a generative grammar, especially of a natural language, that has been developed in a Chomskyan tradition." The immediate question which arises is, what is the antecedent of the pronoun that? The sentence should be transformed so as to be clear. Is this what was meant:

"grammar (especially of a natural language) that has been developed in a Chomskyan tradition"? Is the antecedent of 'that' the word grammar (not language)? (EnochBethany (talk) 21:43, 12 December 2010 (UTC))

The introductory paragraph contains this sentence: "Additionally, transformational grammar is the tradition that gives rise to specific transformational grammars." That seems awkward. What does that mean? Gerntrash (talk) 16:25, 12 March 2013 (UTC)

English is most definitely Context Free

Did Chomsky really say that context free grammars are inadequate for English? That's total nonsense. I looked up the paper, and what he actually says is that certain constructions are not captured unambiguously by just looking at what type of nodes occur on the tree, so that you need to consider what they would transform into if you do a generative transformation. That's reasonable, but is not more than context free. More than context free would be an overlapping construction like this:

John MARY went to the LOOKED OUT store THE WINDOW.

Which is not allowed in nearly any language designed by humans, natural or artificial. Likebox (talk) 04:29, 12 January 2010 (UTC)

@likebox, you'll want to look at the work on Swiss German and Dutch that claims that both languages have crossing dependencies, and are, in fact, context sensitive. A good place to start is: S.M. Shieber. Evidence against the context-freeness of natural language. Linguistics and Philosophy, 8:333-343, 1985. Comhreir (talk) 06:35, 12 January 2010 (UTC)
If by "context sensitive" you mean that some sentences require looking at some non-local relationships on the whole parse tree to determine subject-verb agreement or the relation between a verb and an object, then this is clearly true. But that's a pretty miserable kind of context sensitivity. "English is context free" means that the basic sentence structure consists of non-overlapping clauses, so that you can always find a parse tree for a sentence and the clauses do not overlap.
To appreciate that this is a very special situation, you have to consider a true non-context-free language. Since no examples occur naturally, I will make one up, just to show you how artificial it is. Consider the following construction:
John walked to the store is green is my favorite color.
Meaning "John walked to the store", "the store is green", and "green is my favorite color". Such a monstrosity puts together an overlapping clause structure, which cannot be parenthesized into clauses:
(John walked to [the store) is (green] is my favorite color)
where to prevent confusion, I put two kinds of parentheses to indicate the overlapping constructions. This sort of thing is logically possible in non-context-free artificial languages, but it is not allowed in context-free grammars, and it is also not allowed in natural language. There is no exception to this rule that I know of. That means that basic English grammar is context free, except that not all agreement rules or verb-object relationships are determined by the context-free grammar alone.
If you know an exception to this rule, I would be happy to hear it. Reading linguistics texts, I find that they are often too close to natural language to appreciate that overlapping clauses are logically possible. Some papers do emphasize that the basic rules of English grammar are described by BNF. Likebox (talk) 17:02, 12 January 2010 (UTC)
Again, read Shieber's paper. It has examples exactly like the one you are talking about. Comhreir (talk) 17:15, 12 January 2010 (UTC)
It doesn't have any such examples, because these examples don't exist. I call your bluff. Show me one example. Lift it out of Shieber if you want. Likebox (talk) 03:27, 13 January 2010 (UTC)

Ok--- I read Shieber. His examples are in Swiss German, which allows for verbs to come at the end of a sentence. Then he gives this sentence:

I John(1) the house(2) helped(1) paint(2)

Which attaches a list of objects (John, the house) to a list of verbs (helped, paint) in forward order, not in reverse order. This is not context free, as can be easily proved by extending it to arbitrary strings of objects:

I Alice(1) Bob(2) Carol(3) Dana(4) Ernest(5) John(6) the house(7) told(1) told(2) told(3) told(4) told(5) help(6) paint(7).

It is obvious that as the number of characters extends, the object of the first "told" is arbitrarily deep down in the stack when it is encountered. So this is in fact not context free. But it is unique to German.
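A toy illustration of the "forward order" point (my own shorthand, not taken from Shieber's paper): pairing each object with its verb in reverse order is a stack discipline, which is the kind of matching a pushdown automaton (and hence a context-free grammar) can do; pairing them in the same order, as in the sentence above, is a queue discipline, and that is what takes the construction beyond context-free. The Python sketch below just makes the two pairing orders explicit on a shortened version of that sentence:

    from collections import deque

    objects = ["Alice", "Bob", "Carol", "the house"]
    verbs   = ["told", "told", "help", "paint"]

    def match_nested(objs, vbs):
        """Reverse-order pairing (nested dependencies): a stack suffices."""
        stack = list(objs)
        return [(stack.pop(), v) for v in vbs]

    def match_cross_serial(objs, vbs):
        """Same-order pairing (cross-serial dependencies): a queue is needed."""
        queue = deque(objs)
        return [(queue.popleft(), v) for v in vbs]

    print(match_nested(objects, verbs))
    # [('the house', 'told'), ('Carol', 'told'), ('Bob', 'help'), ('Alice', 'paint')]
    print(match_cross_serial(objects, verbs))
    # [('Alice', 'told'), ('Bob', 'told'), ('Carol', 'help'), ('the house', 'paint')]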

It is possible to add a non-context free attachment rule like this, while preserving most of the non-overlapping structure of the clauses. You could do this by making up the single non-context free rule that verbs coming one after the other produce compound verbs whose arguments are the union of the arguments of the verbs in order. With this rule, the non-context freeness of Swiss German would be isolated in one construction.

The question still remains of whether verbs can bind to distant objects in English. I thought of a few poetic-sounding sentences:

By the sea is where Jack said to Clara to say to Fred to say to Amy that they should meet.

Which could be legitimately interpreted that Jack was thinking that the meeting place should be by the sea, rather than that Jack was by the sea when he said it. This type of thing can be nested:

Amy asked Jack if by the Caspian Sea is where he said they should meet.

So perhaps there are some distant object-verb attachments which can happen. But the basic non-overlapping clause rule is still correct, and that is still much of the essence of context-freeness. Perhaps the absolutely correct statement is "strictly nested clauses", which might be a slightly weaker condition than context-freeness, which also includes the condition that attachments have to be local. Likebox (talk) 05:11, 13 January 2010 (UTC)

But honestly--- I can't think of a natural (not overly poetic) English example which is not context free. Likebox (talk) 05:59, 13 January 2010 (UTC)

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Transformational grammar. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

As of February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete the "External links modified" sections if they want, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{sourcecheck}} (last update: 15 July 2018).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.


Cheers.—InternetArchiveBot (Report bug) 23:04, 5 December 2017 (UTC)

Comments

This article doesn't mention that this theory has basically been disproved by studies since it was written and is basically only useful for academics studying Chomsky specifically. Neuroelectronic (talk) 04:16, 18 December 2018 (UTC)

You can add that yourself, citing reliable sources that say that. Without reliable sources that say that the theory has been disproven, we can't say that. - Donald Albury 17:25, 18 December 2018 (UTC)