Composition is the Core Driver of the Language-selective Network
The frontotemporal language network responds robustly and selectively to sentences. But the features of linguistic input that drive this response and the computations that these language areas support remain debated.
Francis Mollica et al.
Lookback supports semi-parallel, just-in-time processing in second language written composition. [PDF]
Purpose: Competent written production results from semi-parallel processes that flow from content determination to finger movement. Information from upstream processes that is not used immediately is easily lost.
Evgeny Chukharev et al.
Joint attention and exogenous attention allocation during mother-infant interaction at 12 months associate with 24-month vocabulary composition [PDF]
Introduction: Early attentional processes are inherently linked with early parent-infant interactions and play a critical role in shaping cognitive and linguistic development.
Elena Capelli et al.
InternLM-XComposer: A Vision-Language Large Model for Advanced Text-image Comprehension and Composition [PDF]
We propose InternLM-XComposer, a vision-language large model that enables advanced image-text comprehension and composition. The innovative nature of our model is highlighted by three appealing properties: 1) Interleaved Text-Image Composition: InternLM ...
Pan Zhang et al.
How Abilities in Large Language Models are Affected by Supervised Fine-tuning Data Composition [PDF]
Large language models (LLMs) with enormous pre-training token counts and parameter budgets exhibit diverse abilities, including math reasoning, code generation, and instruction following. These abilities are further enhanced by supervised fine-tuning (SFT).
Guanting Dong et al.
LoraHub: Efficient Cross-Task Generalization via Dynamic LoRA Composition [PDF]
Low-rank adaptations (LoRA) are often employed to fine-tune large language models (LLMs) for new tasks. This paper investigates LoRA composability for cross-task generalization and introduces LoraHub, a simple framework devised for the purposive assembly ...
Chengsong Huang et al.
XSTest: A Test Suite for Identifying Exaggerated Safety Behaviours in Large Language Models [PDF]
Without proper safeguards, large language models will readily follow malicious instructions and generate toxic content. This risk motivates safety efforts such as red-teaming and large-scale feedback learning, which aim to make models both helpful and ...
Paul Röttger et al.
Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning [PDF]
The popularity of LLaMA (Touvron et al., 2023a;b) and other recently emerged moderate-sized large language models (LLMs) highlights the potential of building smaller yet powerful LLMs.
Mengzhou Xia et al.
Composable languages for bioinformatics: the NYoSh experiment [PDF]
Language Workbenches (LWBs) are software engineering tools that help domain experts develop solutions to various classes of problems. Some of these tools focus on non-technical users and provide languages to help organize knowledge, while others ...
Manuele Simi, Fabien Campagne
Writing Composition Problem in Arabic Language Learning Among Arabic Language Education Students
Learning foreign languages, especially Arabic, remains an obstacle for non-Arabic speakers. Apart from grammatical gaps and phonological differences between source and target languages, writing composition proficiency is also a problem ...
Zulaeha Zulaeha