Results 191 to 200 of about 21,526
The Relevance of Tree Adjoining Grammar to Generation [PDF]
Grammatical formalisms can be viewed as neutral with respect to comprehension or generation, or they can be investigated from the point of view of their suitability for comprehension or generation. Tree Adjoining Grammar (TAG) is a formalism that factors recursion and dependencies in a special way, leading to a kind of locality and the possibility of ...
openaire +1 more source
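The adjunction operation is what produces the locality this snippet refers to: recursion is factored into auxiliary trees that are spliced into other trees at a matching node. Below is a minimal sketch, assuming a toy (label, children) tuple encoding of trees and a foot node marked with a trailing '*'; the encoding, names, and example sentence are illustrative, not taken from the paper.

# Toy TAG adjunction: splice an auxiliary tree into a derived tree.
# Trees are (label, children) tuples; the auxiliary tree's foot node
# is marked with a trailing '*'.

def adjoin(tree, aux, site_label):
    """Adjoin `aux` at each outermost node of `tree` labelled `site_label`:
    the subtree rooted there is excised and re-attached at aux's foot node."""
    label, children = tree
    if label == site_label:
        return plug_foot(aux, tree)
    return (label, [adjoin(child, aux, site_label) for child in children])

def plug_foot(aux, excised):
    """Replace the foot node (label ending in '*') of `aux` with `excised`."""
    label, children = aux
    if label.endswith("*"):
        return excised
    return (label, [plug_foot(child, excised) for child in children])

# Initial tree for "Harry likes peanuts" and an auxiliary tree for "apparently".
initial = ("S", [("NP", [("Harry", [])]),
                 ("VP", [("V", [("likes", [])]),
                         ("NP", [("peanuts", [])])])])
aux = ("VP", [("Adv", [("apparently", [])]), ("VP*", [])])

# Adjunction wraps the adverb around the original VP:
# ("S", [NP-Harry, ("VP", [Adv-apparently, original VP])])
print(adjoin(initial, aux, "VP"))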
Some of the following articles may not be open access.
Large Scale Semantic Construction for Tree Adjoining Grammars
2005
Although Tree Adjoining Grammars (TAG) are widely used for syntactic processing, there is to date no large-scale TAG available which also supports semantic construction. In this paper, we present a highly factorised way of implementing a syntax/semantic interface in TAG.
Gardent, Claire, Parmentier, Yannick
openaire +3 more sources
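A syntax/semantic interface of this kind typically pairs each elementary tree with a semantic fragment that is assembled as the trees combine during derivation. The sketch below is not the authors' construction; it is a minimal flat-semantics illustration in that spirit, with invented predicate names and variables.

# Illustrative only: each elementary tree carries a set of flat semantic
# literals over variables; combining two trees unions their literals and
# identifies the variables linked by the derivation step.

def substitute(host, arg, bindings):
    """Combine the semantics of `host` and `arg`, renaming variables of
    `arg` according to `bindings` (the argument slot being filled)."""
    renamed = [tuple(bindings.get(t, t) for t in literal) for literal in arg]
    return host + renamed

likes = [("like", "e"), ("agent", "e", "x"), ("theme", "e", "y")]   # tree for "likes"
harry = [("named", "h", "Harry")]                                   # tree for "Harry"
peanuts = [("peanuts", "p")]                                        # tree for "peanuts"

sem = substitute(likes, harry, {"h": "x"})     # Harry fills the agent slot
sem = substitute(sem, peanuts, {"p": "y"})     # peanuts fills the theme slot
print(sem)
# [('like','e'), ('agent','e','x'), ('theme','e','y'),
#  ('named','x','Harry'), ('peanuts','y')]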
Parallel parsing of Tree Adjoining Grammars on the Connection Machine
International Journal of Parallel Programming, 1992
Tree adjoining grammars (TAGs) were introduced by A. K. Joshi, L. S. Levy, and M. Takahashi (1975) mainly as a formalism for natural language specification. The expressive power of TAG formal languages proved to be situated strictly between context-free and context-sensitive languages.
Michael A. Palis, David S. L. Wei
openaire +2 more sources
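A standard witness for that "strictly between" claim is the count language a^n b^n c^n d^n, which no context-free grammar generates but which a TAG with a single auxiliary tree does: each adjunction wraps an outer a...d pair and an inner b...c pair around the adjunction site, keeping the four counts equal. A small sketch of that yield growth (illustrative, not from the paper):

# Mimic n adjunctions of the standard auxiliary tree on the string yield:
# each step adds one symbol to each of the four blocks.
def tag_count_language(n):
    outer_left = outer_right = inner_left = inner_right = ""
    for _ in range(n):
        outer_left += "a"
        inner_left += "b"
        inner_right = "c" + inner_right
        outer_right = "d" + outer_right
    return outer_left + inner_left + inner_right + outer_right

assert tag_count_language(3) == "aaabbbcccddd"
# A context-free grammar can keep only two counts linked (e.g. a^n b^n),
# which is why this language separates TAGs from CFGs.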
On the relation between multi-depth grammars and tree adjoining grammars
1999
Summary: Multi-depth grammars were introduced for describing some non-context-free features of programming languages. They generate a hierarchy of languages, have interesting closure properties, are simpler than other grammar formalisms, and have a very natural accepting device, called the multi-pushdown automaton.
Cherubini, Alessandra +1 more
openaire +2 more sources
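The accepting device mentioned in the summary generalises the pushdown automaton to several stacks. In the ordered variant usually associated with multi-depth grammars, symbols may be pushed onto any stack but popped only from the first non-empty one; the snippet below is a minimal sketch of that restriction under this assumption, with invented helper names, not the authors' construction.

# Illustrative only: an "ordered" multi-pushdown step where pushes may
# target any stack but pops are restricted to the first non-empty stack.
from typing import List

def pop_ordered(stacks: List[list]):
    """Pop from the first non-empty stack, as the ordering discipline requires."""
    for stack in stacks:
        if stack:
            return stack.pop()
    raise ValueError("all stacks are empty")

def push(stacks: List[list], index: int, symbol) -> None:
    """Pushes are unrestricted: any stack may receive a symbol."""
    stacks[index].append(symbol)

stacks = [[], [], []]
push(stacks, 1, "A")
push(stacks, 2, "B")
print(pop_ordered(stacks))   # "A": stack 0 is empty, so stack 1 is popped first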
Two Equivalent Regularizations for Tree Adjoining Grammars
2009
We present and compare two methods for making derivation in a Tree Adjoining Grammar a regular process (in the Chomsky hierarchy sense) without loss of expressive power. One regularization method is based on an algebraic operation called Lifting, while the other exploits an additional spatial dimension by transforming the components of a TAG into ...
openaire +2 more sources
Large Language Models Demonstrate the Potential of Statistical Learning in Language
Cognitive Science, 2023
exaly
Demonstratives, joint attention, and the emergence of grammar
Cognitive Linguistics, 2006
Holger Diessel
exaly
Non‐adjacent Dependency Learning in Humans and Other Animals
Topics in Cognitive Science, 2020
Benjamin Wilson +2 more
exaly