Constraint-based grammar
Constraint-based grammars can perhaps be best understood in contrast to generative grammars. Whereas a generative grammar specifies the transformations, merges, movements, and deletions that produce all well-formed sentences, a constraint-based grammar takes the opposite approach: it permits any structure that is not explicitly ruled out by a constraint.
"The grammar is nothing but a set of constraints that structures are required to satisfy in order to be considered well-formed."[1] "A constraint-based grammar is more like a data base or a knowledge representation system than it is like a collection of algorithms."[2]
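The idea that a grammar is "nothing but a set of constraints" can be sketched in code. The following is an illustrative toy example, not taken from any of the formalisms cited below: feature structures are represented as dictionaries, each constraint is a predicate, and a structure is well-formed exactly when every constraint is satisfied; anything not constrained is allowed. The constraint names (`agreement`, `nominative_subject`) and features are hypothetical choices for illustration.

```python
# Toy sketch of a constraint-based grammar: the grammar is just a set of
# constraints (predicates), and a structure is well-formed iff it
# satisfies all of them. Anything not constrained is permitted.

def agreement(s):
    """Subject and verb must agree in number."""
    return s["subj"]["num"] == s["verb"]["num"]

def nominative_subject(s):
    """The subject must bear nominative case."""
    return s["subj"]["case"] == "nom"

# The grammar is a set of constraints, much like a knowledge base,
# rather than a procedure for generating sentences.
GRAMMAR = {agreement, nominative_subject}

def well_formed(structure, grammar=GRAMMAR):
    return all(constraint(structure) for constraint in grammar)

ok = {"subj": {"num": "sg", "case": "nom"}, "verb": {"num": "sg"}}
bad = {"subj": {"num": "sg", "case": "acc"}, "verb": {"num": "pl"}}

print(well_formed(ok))   # True
print(well_formed(bad))  # False
```

Note that nothing in the sketch enumerates well-formed structures; it only filters candidates, which is the sense in which such a grammar resembles a database queried for constraint satisfaction rather than a collection of generative algorithms.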
Examples of such grammars include
- Lakoff's non-procedural variant of Transformational Grammar, which formulates constraints on potential tree sequences[3]
- Johnson and Postal's formalization of Relational Grammar (1980)[3]
- Generalized Phrase Structure Grammar in the variants developed by Gazdar et al. (1988), Blackburn et al. (1993), and Rogers (1997)[3]
- Lexical Functional Grammar (LFG) in the formalization of Kaplan (1995)[3]
- Head-Driven Phrase Structure Grammar (HPSG) in the formalization of King (1999)[3]
- Constraint handling rule grammars[4]
References
- ^ Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asian conference on language, information and computation.
- ^ Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asian conference on language, information and computation.
- ^ a b c d Müller, Stefan (2016). Grammatical theory: From transformational grammar to constraint-based approaches. Berlin: Language Science Press. pp. 490–491.
- ^ Christiansen, Henning. "CHR Grammars with multiple constraint stores." First Workshop on Constraint Handling Rules: Selected Contributions. Universität Ulm, Fakultät für Informatik, 2004.