Results 261 to 270 of about 13,227,822
Some of the following articles may not be open access.
TinyLlama: An Open-Source Small Language Model
arXiv.org
We present TinyLlama, a compact 1.1B language model pretrained on around 1 trillion tokens for approximately 3 epochs. Building on the architecture and tokenizer of Llama 2, TinyLlama leverages various advances contributed by the open-source community (e.
Peiyuan Zhang +3 more
semanticscholar +1 more source
BioMistral: A Collection of Open-Source Pretrained Large Language Models for Medical Domains
Annual Meeting of the Association for Computational Linguistics
Large Language Models (LLMs) have demonstrated remarkable versatility in recent years, offering potential applications across specialized domains such as healthcare and medicine.
Yanis Labrak +5 more
semanticscholar +1 more source
Source-to-source translation of visual languages
Nord. J. Comput., 2004
Summary: We study the problem of translation between visual languages. Languages with a visual syntax are currently in wide use, due to the popularity of different kinds of diagrams used especially in software analysis, design, and animation. Unlike conventional textual languages, visual languages are still quite immature with regard to the formalisms ...
Jukka Paakki, Antti-Pekka Tuovinen
openaire +1 more source
InternVL3: Exploring Advanced Training and Test-Time Recipes for Open-Source Multimodal Models
arXiv.org
We introduce InternVL3, a significant advancement in the InternVL series featuring a native multimodal pre-training paradigm. Rather than adapting a text-only large language model (LLM) into a multimodal large language model (MLLM) that supports visual ...
Jinguo Zhu +47 more
semanticscholar +1 more source
Source Language Interference in English-to-Chinese Translation
Yearbook of Corpus Linguistics and Pragmatics, 2015
Richard Xiao
exaly +2 more sources
InternVL3.5: Advancing Open-Source Multimodal Models in Versatility, Reasoning, and Efficiency
arXiv.org
We introduce InternVL 3.5, a new family of open-source multimodal models that significantly advances versatility, reasoning capability, and inference efficiency along the InternVL series. A key innovation is the Cascade Reinforcement Learning (Cascade RL)
Weiyun Wang +62 more
semanticscholar +1 more source
arXiv.org
We introduce InternVL 2.5, an advanced multimodal large language model (MLLM) series that builds upon InternVL 2.0, maintaining its core model architecture while introducing significant enhancements in training and testing strategies as well as data ...
Zhe Chen +39 more
semanticscholar +1 more source
2003
MSN demonstrates that the most important contributor languages for Israeli are: (i) Indo-European — mostly Germanic and Slavonic: Yiddish, Polish, Russian, English and German; (ii) Western Semitic: Hebrew, Arabic and Aramaic (on Aramaic, see Eliezer Meir Lipschutz in ZV 4, 1914: 20).
openaire +1 more source
Octo: An Open-Source Generalist Robot Policy
Robotics: Science and Systems
Large policies pretrained on diverse robot datasets have the potential to transform robotic learning: instead of training new policies from scratch, such generalist robot policies may be finetuned with only a little in-domain data, yet generalize broadly.
O. Team +17 more
semanticscholar +1 more source
Source Code Summarization in the Era of Large Language Models
International Conference on Software Engineering
To support software developers in understanding and maintaining programs, various automatic (source) code summarization techniques have been proposed to generate a concise natural language summary (i.e., comment) for a given code snippet.
Weisong Sun +8 more
semanticscholar +1 more source

