Constraint-based grammar
Constraint-based grammars can perhaps be best understood in contrast to generative grammars. A generative grammar lists all the transformations, merges, movements, and deletions that can produce all well-formed sentences, whereas constraint-based grammars take the opposite approach, permitting any structure that is not explicitly ruled out. "The grammar is nothing but a set of constraints that structures are required to satisfy in order to be considered well-formed."[1] "A constraint-based grammar is more like a data base or a knowledge representation system than it is like a collection of algorithms."[2]
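The constraint-satisfaction view described above can be illustrated with a minimal sketch. The constraint names, feature structures, and the agreement rule below are illustrative assumptions, not part of any particular formalism: the point is only that the grammar is a set of predicates, and a structure is well-formed exactly when it satisfies all of them.

```python
# Minimal sketch of the constraint-based view (hypothetical constraints):
# the grammar is nothing but a set of constraints, and a structure is
# well-formed iff it satisfies every one of them. Anything not ruled
# out by a constraint is allowed.

def subject_verb_agreement(structure):
    """Constraint: subject and verb must agree in number."""
    return structure["subject"]["number"] == structure["verb"]["number"]

def clause_has_verb(structure):
    """Constraint: every clause must contain a verb."""
    return "verb" in structure

# The grammar is just this collection of constraints, not a procedure.
GRAMMAR = [subject_verb_agreement, clause_has_verb]

def well_formed(structure):
    """Well-formedness = satisfaction of all constraints."""
    return all(constraint(structure) for constraint in GRAMMAR)

ok = {"subject": {"number": "sg"}, "verb": {"number": "sg"}}
bad = {"subject": {"number": "sg"}, "verb": {"number": "pl"}}
print(well_formed(ok))   # True
print(well_formed(bad))  # False
```

Note that nothing in this sketch enumerates or derives sentences; it only checks candidate structures, which is what makes the grammar resemble a knowledge base rather than a collection of algorithms.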
Examples of such grammars include:
- Lakoff's non-procedural variant of Transformational Grammar, which formulates constraints on potential tree sequences[3]
- Johnson and Postal's formalization of Relational Grammar (1980)[3]
- GPSG in the variants developed by Gazdar et al. (1988), Blackburn et al. (1993), and Rogers (1997)[3]
- LFG in the formalization of Kaplan (1995)[3]
- HPSG in the formalization of King (1999)[3]
- grammars based on Constraint Handling Rules (CHR)[4]
References
- ↑ Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asia Conference on Language, Information and Computation.
- ↑ Pollard, Carl. "The nature of constraint-based grammar" (PDF). 11th Pacific Asia Conference on Language, Information and Computation.
- ↑ Müller, Stefan (2016). Grammatical theory: From transformational grammar to constraint-based approaches. Berlin: Language Science Press. pp. 490–491.
- ↑ Christiansen, Henning. "CHR Grammars with multiple constraint stores." First Workshop on Constraint Handling Rules: Selected Contributions. Universität Ulm, Fakultät für Informatik, 2004.
This article is issued from Wikipedia. The text is licensed under Creative Commons Attribution-ShareAlike; additional terms may apply for the media files.