Composition is the Core Driver of the Language-selective Network
The frontotemporal language network responds robustly and selectively to sentences. But the features of linguistic input that drive this response and the computations that these language areas support remain debated.
Francis Mollica +8 more
doaj +2 more sources
Joint attention and exogenous attention allocation during mother-infant interaction at 12 months associate with 24-month vocabulary composition [PDF]
Early attentional processes are inherently linked with early parent-infant interactions and play a critical role in shaping cognitive and linguistic development.
Elena Capelli +4 more
doaj +2 more sources
Piccola - A Small Composition Language [PDF]
Piccola is a “small composition language” currently being developed within the Software Composition Group. The goal of Piccola is to support the flexible composition of applications from software components.
Oscar Nierstrasz
semanticscholar +2 more sources
Writing Language: Composition, the Academy, and Work [PDF]
This paper argues that while college composition courses are commonly charged with remediating students by providing them with the literacy skills they lack, they may instead be redefined as providing the occasion for rewriting language and knowledge ...
Bruce Horner
doaj +2 more sources
Apocrypha “The Passion of Jesus Christ”: Genesis, Composition, Language Features
The article deals with the issues of genesis, content, and peculiarities of existence in the Russian manuscript tradition of the passionary compiled work "The Passion of Jesus Christ" devoted to the description of the last days of the Saviour's life on ...
Elina Valeryevna Serebryakova
doaj +2 more sources
InternLM-XComposer: A Vision-Language Large Model for Advanced Text-image Comprehension and Composition [PDF]
We propose InternLM-XComposer, a vision-language large model that enables advanced image-text comprehension and composition. The innovative nature of our model is highlighted by three appealing properties: 1) Interleaved Text-Image Composition: InternLM ...
Pan Zhang +18 more
semanticscholar +1 more source
How Abilities in Large Language Models are Affected by Supervised Fine-tuning Data Composition [PDF]
Large language models (LLMs) with enormous pre-training tokens and parameters exhibit diverse abilities, including math reasoning, code generation, and instruction following. These abilities are further enhanced by supervised fine-tuning (SFT).
Guanting Dong +9 more
semanticscholar +1 more source
A Compositional Neural Architecture for Language [PDF]
Hierarchical structure and compositionality imbue human language with unparalleled expressive power and set it apart from other perception–action systems. However, neither formal nor neurobiological models account for how these defining computational properties might arise in a physiological system.
Andrea E. Martin
openaire +10 more sources
Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning [PDF]
The popularity of LLaMA (Touvron et al., 2023a;b) and other recently emerged moderate-sized large language models (LLMs) highlights the potential of building smaller yet powerful LLMs.
Mengzhou Xia +3 more
semanticscholar +1 more source
Composable languages for bioinformatics: the NYoSh experiment [PDF]
Language WorkBenches (LWBs) are software engineering tools that help domain experts develop solutions to various classes of problems. Some of these tools focus on non-technical users and provide languages to help organize knowledge while other ...
Manuele Simi, Fabien Campagne
doaj +2 more sources