Constraint-based grammar
Constraint-based grammars can perhaps be best understood in contrast to generative grammars. A generative grammar lists all the transformations, merges, movements, and deletions that can result in all well-formed sentences, while constraint-based grammars take the opposite approach, allowing anything that is not otherwise constrained. "The grammar is nothing but a set of constraints that structures are required to satisfy in order to be considered well-formed."[1] "A constraint-based grammar is more like a data base or a knowledge representation system than it is like a collection of algorithms."[2]
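The contrast above can be illustrated with a minimal sketch, assuming a deliberately toy setup (plain dictionaries as stand-ins for feature structures, and invented constraints; this is not any specific formalism such as HPSG or LFG): the grammar is simply a set of constraints, and a candidate structure is well-formed if and only if it satisfies every one of them.

```python
# Toy illustration of a constraint-based grammar: the grammar is nothing
# but a set of constraints; anything not ruled out is well-formed.
# Candidate "structures" are plain dicts standing in for feature structures.

def has_verb(structure):
    # Constraint (invented for illustration): every clause contains a verb.
    return structure.get("verb") is not None

def subject_verb_agreement(structure):
    # Constraint (invented for illustration): subject and verb agree in number.
    return structure.get("subj_num") == structure.get("verb_num")

# The grammar IS this collection of constraints, not a set of rewrite rules.
GRAMMAR = [has_verb, subject_verb_agreement]

def well_formed(structure, grammar=GRAMMAR):
    """A structure is well-formed iff it satisfies all constraints."""
    return all(constraint(structure) for constraint in grammar)

ok = {"verb": "runs", "subj_num": "sg", "verb_num": "sg"}
bad = {"verb": "run", "subj_num": "sg", "verb_num": "pl"}

print(well_formed(ok))   # True
print(well_formed(bad))  # False
```

Note that nothing here enumerates or derives sentences; the grammar only checks candidates, which is why Pollard likens it to a knowledge representation system rather than a collection of algorithms.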
Examples of such grammars include:[3]
- the non-procedural variant of Lakoff's Transformational Grammar, which formulates constraints on potential tree sequences,
- Johnson and Postal’s formalization of Relational Grammar (1980), GPSG in the variants developed by Gazdar et al. (1988), Blackburn et al. (1993) and Rogers (1997),
- LFG in the formalization of Kaplan (1995) and
- HPSG in the formalization of King (1999).
References
- ↑ Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asia Conference on Language, Information and Computation.
- ↑ Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asia Conference on Language, Information and Computation.
- ↑ Müller, Stefan (2016). Grammatical theory: From transformational grammar to constraint-based approaches. Berlin: Language Science Press. pp. 490–491.
This article is issued from Wikipedia (version of 8/3/2016). The text is available under the Creative Commons Attribution/Share-Alike license, but additional terms may apply for the media files.