Results 281 to 290 of about 3,659,116
Some of the following articles may not be open access.

HuatuoGPT-II, One-stage Training for Medical Adaption of LLMs

arXiv.org, 2023
Adapting a language model to a specific domain, a.k.a. 'domain adaptation', is a common practice when specialized knowledge, e.g. medicine, is not encapsulated in a general language model like Llama2. The challenge lies in the heterogeneity of data across
Junying Chen   +12 more
semanticscholar   +1 more source

Depicting Beyond Scores: Advancing Image Quality Assessment through Multi-modal Language Models

European Conference on Computer Vision, 2023
We introduce a Depicted image Quality Assessment method (DepictQA), overcoming the constraints of traditional score-based methods. DepictQA allows for detailed, language-based, human-like evaluation of image quality by leveraging Multi-modal Large ...
Zhiyuan You   +5 more
semanticscholar   +1 more source

A Survey on LLM-based Code Generation for Low-Resource and Domain-Specific Programming Languages

ACM Transactions on Software Engineering and Methodology
Large Language Models (LLMs) have shown remarkable capabilities in code generation for popular programming languages. However, their performance in Low-Resource Programming Languages (LRPLs) and Domain-Specific Languages (DSLs) remains a critical ...
Sathvik Joel, J. Wu, Fatemeh H. Fard
semanticscholar   +1 more source

LLaMAX: Scaling Linguistic Horizons of LLM by Enhancing Translation Capabilities Beyond 100 Languages

Conference on Empirical Methods in Natural Language Processing
Large Language Models (LLMs) demonstrate remarkable translation capabilities in high-resource language tasks, yet their performance in low-resource languages is hindered by insufficient multilingual data during pre-training.
Yinquan Lu   +4 more
semanticscholar   +1 more source

Hemispheric specialization for sign language

Neuropsychologia, 1996
Most studies on sign lateralization provide inconclusive results about the role of the two hemispheres in sign language processing, whereas the cases reported in the clinical literature show sign language impairment only following left hemisphere damage, suggesting a similar neural organization to spoken languages. By discriminating different levels of
G. Grossi   +3 more
openaire   +3 more sources

PUMA: A Programmable Ultra-efficient Memristor-based Accelerator for Machine Learning Inference

International Conference on Architectural Support for Programming Languages and Operating Systems, 2019
Memristor crossbars are circuits capable of performing analog matrix-vector multiplications, overcoming the fundamental energy efficiency limitations of digital logic. They have been shown to be effective in special-purpose accelerators for a limited set
Aayush Ankit   +10 more
semanticscholar   +1 more source

Special Languages

1997
Abstract The earliest explicit contrast between poetic and prosaic language is made (about 365) by Isocrates (ix. 9 f.). The lexemes, he says, which poets are able to use are not only τεταγμένα, those prescribed (sc. by general usage), but also those which are 'alien', καινά ...
openaire   +1 more source

Gemini Embedding: Generalizable Embeddings from Gemini

arXiv.org
In this report, we introduce Gemini Embedding, a state-of-the-art embedding model leveraging the power of Gemini, Google's most capable large language model.
Jinhyuk Lee   +46 more
semanticscholar   +1 more source

COMMON LANGUAGE VERSUS SPECIALIZED LANGUAGE

Journal of Information Systems and Operations Management, 2011
This paper deals with the presentation of the common language and the specialized one. We also highlighted the relations and the differences between them. The specialized language is a vector of specialized knowledge, but sometimes it contains units from the common language.
openaire  

Special Families of Sewing Languages

2000
Journal of Automata, Languages and Combinatorics, Volume 5, Number 3, 2000, 279 ...
Martín-Vide, Carlos   +1 more
openaire   +2 more sources
