Results 161 to 170 of about 6,156,320 (380)
Goldfish: Monolingual Language Models for 350 Languages [PDF]
For many low-resource languages, the only available language models are large multilingual models trained on many languages simultaneously. However, using FLORES perplexity as a metric, we find that these models perform worse than simple bigram models for many languages (e.g. 24% of languages in XGLM 4.5B; 43% in BLOOM 7.1B).
arxiv
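The bigram model in that comparison is the classical count-based baseline. The sketch below is a minimal illustration, not the Goldfish authors' setup: it computes the perplexity of an add-alpha-smoothed bigram model on held-out tokens, with the toy corpus, whitespace tokenisation, and smoothing all being assumptions.

```python
import math
from collections import Counter

def bigram_perplexity(train_tokens, test_tokens, alpha=1.0):
    """Perplexity of an add-alpha-smoothed bigram model on held-out tokens."""
    vocab = set(train_tokens) | set(test_tokens)
    unigram = Counter(train_tokens)
    bigram = Counter(zip(train_tokens, train_tokens[1:]))

    log_prob = 0.0
    for prev, cur in zip(test_tokens, test_tokens[1:]):
        # P(cur | prev) with add-alpha smoothing over the joint vocabulary
        p = (bigram[(prev, cur)] + alpha) / (unigram[prev] + alpha * len(vocab))
        log_prob += math.log(p)
    return math.exp(-log_prob / (len(test_tokens) - 1))

# Toy usage (assumed data): lower perplexity means a better fit to the held-out text
train = "the cat sat on the mat and the cat ate the fish".split()
test = "the cat sat on the mat".split()
print(bigram_perplexity(train, test))
```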
Application of Pretrained Large Language Models in Embodied Artificial Intelligence [PDF]
A. K. Kovalev, Aleksandr I. Panov
openalex +1 more source
This paper argues that large language models have a valuable scientific role to play in serving as scientific models of a language. Linguistic study should not only be concerned with the cognitive processes behind linguistic competence, but also with language understood as an external, social entity. Once this is recognized, the value of large language ...
arxiv
Mitochondrial DNA disorders in neuromuscular diseases in diverse populations
Neuromuscular features are common in mitochondrial DNA (mtDNA) disorders. The genetic architecture of mtDNA disorders in diverse populations is poorly understood. We analysed mtDNA variants from whole‐exome sequencing data in neuromuscular patients from South Africa, Brazil, India, Turkey and Zambia. In 998 individuals, there were two definite ...
Fei Gao+34 more
wiley +1 more source
LLaMA-Reg: Using LLaMA 2 for Unsupervised Medical Image Registration [PDF]
Medical image registration is an essential topic in medical image analysis. In this paper, we propose a method for medical image registration using a pretrained large language model. We find that using the pretrained large language model to encode deep features of the medical images in the registration model can effectively improve image registration ...
arxiv
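The snippet describes routing image features through a pretrained language model inside the registration network. The sketch below is only a schematic of that general idea: a randomly initialised torch TransformerEncoderLayer stands in for a frozen LLaMA 2 block, and the dimensions are assumed; the paper's actual architecture and projection scheme are not reproduced here.

```python
import torch
import torch.nn as nn

class FrozenLMFeatureEncoder(nn.Module):
    """Schematic: pass CNN feature maps through a frozen transformer block.

    The block is a randomly initialised stand-in for a pretrained LLaMA 2
    layer (assumption); lm_dim is kept small here, the real model width is 4096.
    """
    def __init__(self, cnn_channels=256, lm_dim=512, nhead=8):
        super().__init__()
        self.to_lm = nn.Linear(cnn_channels, lm_dim)       # project into LM width
        self.lm_block = nn.TransformerEncoderLayer(
            d_model=lm_dim, nhead=nhead, batch_first=True)
        for p in self.lm_block.parameters():               # keep the LM frozen
            p.requires_grad = False
        self.from_lm = nn.Linear(lm_dim, cnn_channels)     # project back

    def forward(self, feat):                               # feat: (B, C, H, W)
        b, c, h, w = feat.shape
        tokens = feat.flatten(2).transpose(1, 2)           # (B, H*W, C) token sequence
        tokens = self.from_lm(self.lm_block(self.to_lm(tokens)))
        return tokens.transpose(1, 2).reshape(b, c, h, w)  # back to a feature map

# Hypothetical use on a coarse feature map from a registration backbone
x = torch.randn(1, 256, 12, 12)
print(FrozenLMFeatureEncoder()(x).shape)                   # torch.Size([1, 256, 12, 12])
```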
Dynamic Fusion: Attentional Language Model for Neural Machine Translation [PDF]
Neural Machine Translation (NMT) can be used to generate fluent output. As such, language models have been investigated for incorporation with NMT. In prior investigations, two models have been used: a translation model and a language model. The translation model's predictions are weighted by the language model with a hand-crafted ratio in advance ...
arxiv
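The "hand-crafted ratio" refers to fixing a single interpolation weight between the two models' predictions ahead of time, the baseline that the paper's attentional fusion replaces with a learned, context-dependent weighting. A minimal sketch of that fixed log-linear weighting, over an assumed toy vocabulary:

```python
import numpy as np

def fuse_predictions(p_tm, p_lm, lam=0.2):
    """Combine translation-model and language-model next-token distributions.

    Log-linear fusion with a fixed ratio lam, i.e. the hand-crafted weighting
    the abstract contrasts with learned fusion. Inputs are probability vectors.
    """
    log_p = np.log(p_tm) + lam * np.log(p_lm)
    p = np.exp(log_p - log_p.max())     # renormalise with numerical stability
    return p / p.sum()

# Hypothetical 4-word vocabulary
p_tm = np.array([0.5, 0.3, 0.1, 0.1])
p_lm = np.array([0.25, 0.25, 0.4, 0.1])
print(fuse_predictions(p_tm, p_lm))
```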
Objectives: This study sought to evaluate proteomic, metabolomic, and immune signatures in the cerebrospinal fluid of individuals with Down Syndrome Regression Disorder (DSRD). Methods: A prospective case–control study comparing proteomic, metabolomic, and immune profiles in individuals with DSRD was performed.
Jonathan D. Santoro+12 more
wiley +1 more source
Use of the “quick brown fox jumps over the lazy dog” pangram in academic papers
In the English language, a sentence that contains all letters of the alphabet, such as “The/A quick brown fox jumps over the lazy dog”, is known as a pangram. Curiously, despite its odd meaning, this fox-dog pangram has found practical use in some ...
Jaime A. Teixeira da Silva
doaj +1 more source
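The definition quoted in the abstract, a sentence that contains every letter of the alphabet, reduces to a one-line set check; a minimal sketch:

```python
import string

def is_pangram(sentence: str) -> bool:
    """True if the sentence uses every letter of the English alphabet."""
    return set(string.ascii_lowercase) <= set(sentence.lower())

print(is_pangram("The quick brown fox jumps over the lazy dog"))   # True
print(is_pangram("The quick brown fox jumped over the lazy dog"))  # False: no 's'
```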
An MRI assessment of mechanisms underlying lesion growth and shrinkage in multiple sclerosis
By applying the tensor model, we analysed lesion orientation and the directionality of lesion expansion/contraction in multiple sclerosis. Each lesion is summarized as an ellipsoid, and the tensor model is applied to calculate lesion anisotropy. From top to bottom: white matter atlas, surface‐in gradient segmentation, and venous atlas used in the ...
Ermelinda De Meo+9 more
wiley +1 more source
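Summarising each lesion as an ellipsoid means its shape is captured by three tensor eigenvalues, from which an anisotropy index can be computed. The function below uses the standard fractional-anisotropy formula as an illustration; whether the paper uses exactly this index is an assumption.

```python
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy of an ellipsoid with eigenvalues l1 >= l2 >= l3.

    A common way to quantify how elongated (anisotropic) a lesion shape is;
    0 for a sphere, approaching 1 for a strongly elongated ellipsoid.
    """
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return np.sqrt(0.5 * num / den)

print(fractional_anisotropy(3.0, 1.0, 1.0))  # elongated lesion -> ~0.60
print(fractional_anisotropy(1.0, 1.0, 1.0))  # spherical lesion -> 0.0
```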
Large language models in critical care
The advent of Chat Generative Pre-trained Transformer (ChatGPT) and large language models (LLMs) has revolutionized natural language processing (NLP).
Laurens A. Biesheuvel+6 more
doaj