
Eco: A Language Composition Editor [PDF]

open access: possible · Software Language Engineering, 2014
Language composition editors have traditionally fallen into two extremes: traditional parsing, which is inflexible or ambiguous; or syntax directed editing, which programmers dislike. In this paper we extend an incremental parser to create an approach which bridges the two extremes: our prototype editor ‘feels’ like a normal text editor, but the user ...
Lukas Diekmann, L. Tratt
semanticscholar   +3 more sources

Composition of Languages, Models, and Analyses

2021
This chapter targets a better understanding of the compositionality of analyses, including different forms of compositionality and specific conditions of composition. Analysis involves models, contexts, and properties. These are all expressed in languages with their own semantics.
C. Talcott   +12 more
openaire   +3 more sources

Language extension and composition with language workbenches

Proceedings of the ACM international conference companion on Object oriented programming systems languages and applications companion, 2010
Domain-specific languages (DSLs) provide high expressive power focused on a particular problem domain. They provide linguistic abstractions and specialized syntax specifically designed for a domain, allowing developers to avoid boilerplate code and low-level implementation details. Language workbenches are tools that integrate all aspects of the ...
Markus Völter, Eelco Visser
openaire   +2 more sources

On mutual underivability of compositions in the minimal composition language MCL [PDF]

open access: possible · Cybernetics, 1990
Mutual underivability of compositions in the language MCL is considered. The main focus is on constraints that should be imposed on the set of named data. The compositions are shown to be mutually underivable when the set of named data is regular.
V. V. Pogosyan, D. B. Bui
openaire   +2 more sources

Function Vectors in Large Language Models

International Conference on Learning Representations, 2023
We report the presence of a simple neural mechanism that represents an input-output function as a vector within autoregressive transformer language models (LMs).
Eric Todd   +5 more
semanticscholar   +1 more source

The JOpera visual composition language

Journal of Visual Languages & Computing, 2005
Composing Web services into a coherent application can be a tedious and error-prone task when using traditional textual scripting languages or emerging XML-based approaches. As an alternative, complex interactions patterns and data exchanges between different Web services can be effectively modeled using a visual language. In this paper, we discuss the
Cesare Pautasso, G. Alonso
semanticscholar   +2 more sources

Structural relations of language and cognitive skills, and topic knowledge to written composition: A test of the direct and indirect effects model of writing.

British Journal of Educational Psychology, 2019
BACKGROUND: Writing involves multiple processes, drawing on a number of language, cognitive, and print-related skills, and knowledge. According to the Direct and Indirect Effects model of Writing (DIEW; Kim & Park, 2019, Reading and Writing, 32, 1319 ...
Y. Kim
semanticscholar   +1 more source

InternLM-XComposer2: Mastering Free-form Text-Image Composition and Comprehension in Vision-Language Large Model

arXiv.org
We introduce InternLM-XComposer2, a cutting-edge vision-language model excelling in free-form text-image composition and comprehension. This model goes beyond conventional vision-language understanding, adeptly crafting interleaved text-image content ...
Xiao-wen Dong   +22 more
semanticscholar   +1 more source

Are Logical Languages Compositional?

Studia Logica, 2013
In this paper I argue that in contrast to natural languages, logical languages typically are not compositional. This does not mean that the meaning of expressions cannot be determined at all using some well-defined set of rules. It only means that the meaning of an expression cannot be determined without looking at its form. If one is serious about the
openaire   +2 more sources

Aya Model: An Instruction Finetuned Open-Access Multilingual Language Model

Annual Meeting of the Association for Computational Linguistics
Recent breakthroughs in large language models (LLMs) have centered around a handful of data-rich languages. What does it take to broaden access to breakthroughs beyond first-class citizen languages?
A. Ustun   +16 more
semanticscholar   +1 more source
