Results 281 to 290 of about 22,713,326
Some of the following articles may not be open access.
2010
Adopting an integrated approach, the contribution demonstrates that context can no longer be seen as an analytic prime. Rather than being looked upon as an external constraint on linguistic performance, it is analyzed as a product of language use, as interactionally constructed and as negotiated.
openaire +1 more source
Long-context LLMs Struggle with Long In-context Learning
Trans. Mach. Learn. Res.
Large Language Models (LLMs) have made significant strides in handling long sequences. Some models, such as Gemini, are even claimed to be capable of handling millions of tokens.
Tianle Li +4 more
semanticscholar +1 more source
2015
Affiliation: Chornogubsky Clerici, Laura. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Parque Centenario.
Goin, Francisco Javier +4 more
openaire +2 more sources
Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
arXiv.org
This work introduces an efficient method to scale Transformer-based Large Language Models (LLMs) to infinitely long inputs with bounded memory and computation. A key component in our proposed approach is a new attention technique dubbed Infini-attention.
Tsendsuren Munkhdalai +2 more
semanticscholar +1 more source
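To make the compressive-memory idea named in this abstract concrete, here is a minimal single-head NumPy sketch: local softmax attention within a segment, a linear-attention read/write to a carried memory, and a sigmoid gate mixing the two, as the paper describes. All shapes, the ELU+1 feature map, and the scalar gate are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def elu_plus_one(x):
    # Feature map for the linear-attention memory (ELU + 1, per the paper).
    return np.where(x > 0, x + 1.0, np.exp(x))

def infini_attention_segment(Q, K, V, M, z, beta):
    """One segment of compressive-memory attention.

    Q, K, V: (seg_len, d) projections for the current segment.
    M: (d, d) compressive memory carried over from earlier segments.
    z: (d,) normalization term for the memory.
    beta: scalar gate mixing memory retrieval with local attention.
    """
    d = Q.shape[-1]
    # Local causal softmax attention within the segment.
    scores = Q @ K.T / np.sqrt(d)
    scores[np.triu(np.ones_like(scores), k=1).astype(bool)] = -np.inf
    A_local = softmax(scores) @ V
    # Retrieve from the compressive memory with a linear-attention read.
    sQ = elu_plus_one(Q)
    A_mem = (sQ @ M) / (sQ @ z)[:, None]
    # Write this segment's keys/values into the memory, then gate.
    sK = elu_plus_one(K)
    M = M + sK.T @ V
    z = z + sK.sum(axis=0)
    g = 1.0 / (1.0 + np.exp(-beta))  # sigmoid gate
    return g * A_mem + (1.0 - g) * A_local, M, z

d, seg = 8, 4
rng = np.random.default_rng(0)
M, z = np.zeros((d, d)), np.full(d, 1e-6)  # empty memory; tiny z avoids 0/0
for _ in range(3):  # stream segments while carrying only O(d^2) state
    Q, K, V = (rng.normal(size=(seg, d)) for _ in range(3))
    out, M, z = infini_attention_segment(Q, K, V, M, z, beta=0.0)
print(out.shape)  # (4, 8)
```

The point of the sketch is the bounded state: however many segments stream through, only M and z are carried forward.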
Experientia, 1990
The analysis of coding sequences reveals nonrandomness in the context of both sense and stop codons. Part of this is related to nucleotide doublet preference, seen also in non-coding sequences and thought to arise from the dependence of mutational events on surrounding sequence.
openaire +2 more sources
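The doublet preference this abstract mentions can be illustrated by comparing observed dinucleotide counts against an independence baseline. This is a toy measurement, not the paper's statistical model.

```python
from collections import Counter

def doublet_bias(seq):
    """Observed/expected ratios for each nucleotide doublet.

    Expected frequency assumes adjacent positions are independent
    (product of single-nucleotide frequencies) -- an illustrative
    baseline only.
    """
    seq = seq.upper()
    singles = Counter(seq)
    doublets = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
    n_s, n_d = len(seq), len(seq) - 1
    return {pair: count / ((singles[pair[0]] / n_s)
                           * (singles[pair[1]] / n_s) * n_d)
            for pair, count in doublets.items()}

print(doublet_bias("ATGGCGCGCTAATAG"))  # toy coding sequence with stop codons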
2011
In this paper we consider an operation of inserting contexts in a word controlled by a contextual scheme X which provides a selection criterion for contextual insertion. We say that a language L is k-stable w.r.t. a contextual scheme X if by making any k context insertions in a word of L we still obtain a word of L; L is k-anti-stable w.r.t.
BOTTONI, Paolo Gaspare +4 more
openaire +2 more sources
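A brute-force sketch of the k-stability property defined in this snippet: enumerate all words reachable by up to k contextual insertions from a finite sample of L and test membership. Wrapping a context (u, v) around an arbitrary factor is one common formalization of contextual insertion; the selection scheme X is left abstract here, so this is an assumption, not the paper's definition.

```python
def insertions(word, contexts):
    """All words obtained by one contextual insertion: wrap a
    context (u, v) around some factor word[i:j]."""
    out = set()
    for u, v in contexts:
        for i in range(len(word) + 1):
            for j in range(i, len(word) + 1):
                out.add(word[:i] + u + word[i:j] + v + word[j:])
    return out

def is_k_stable(sample, in_language, contexts, k):
    """Check k-stability on a finite sample of L: every word reachable
    by up to k insertions must still satisfy the membership test."""
    frontier = set(sample)
    for _ in range(k):
        frontier = set().union(*(insertions(w, contexts) for w in frontier))
        if not all(in_language(w) for w in frontier):
            return False
    return True

def balanced(w):  # membership test for the Dyck language over ( )
    depth = 0
    for c in w:
        depth += 1 if c == "(" else -1
        if depth < 0:
            return False
    return depth == 0

# Balanced strings stay balanced under any insertion of ("(", ")"):
print(is_k_stable({"()", "(())"}, balanced, [("(", ")")], k=2))  # True
```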
Context and Context Management
2019
The user and user requirements are the primary concerns in designing a pervasive computing system. The context of a user or device is another essential element in designing a context-aware pervasive computing system. This chapter focuses on what context is, the variety of contexts, and the impact of that context information in developing a ...
Parikshit N. Mahalle, Prashant S. Dhotre
openaire +1 more source
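One way to ground the chapter's notion of context management is a tiny publish/subscribe sketch: sensors push context records, and context-aware components react. The attribute set (user, device, location, time) is one common decomposition of context; the class names are hypothetical, not the chapter's own design.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable

@dataclass
class Context:
    """A minimal context record for a pervasive-computing system."""
    user: str
    device: str
    location: str
    timestamp: datetime = field(default_factory=datetime.now)

class ContextManager:
    """Toy context manager: sensors call update(); context-aware
    components register callbacks to be notified of changes."""
    def __init__(self):
        self._subscribers: list[Callable[[Context], None]] = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def update(self, ctx: Context):
        for cb in self._subscribers:
            cb(ctx)

mgr = ContextManager()
mgr.subscribe(lambda c: print(f"{c.user}@{c.location} via {c.device}"))
mgr.update(Context(user="alice", device="phone", location="lab"))
```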
In-Context Learning with Long-Context Models: An In-Depth Exploration
North American Chapter of the Association for Computational Linguistics
As model context lengths continue to increase, the number of demonstrations that can be provided in-context approaches the size of entire training datasets. We study the behavior of in-context learning (ICL) at this extreme scale on multiple datasets and
Amanda Bertsch +5 more
semanticscholar +1 more source
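The extreme-scale ICL regime this abstract studies amounts to packing very many labeled demonstrations into a single prompt. A minimal sketch of that prompt assembly follows; the template and task are illustrative assumptions, not the paper's setup.

```python
def build_icl_prompt(demos, query, instruction="Classify the text."):
    """Assemble a many-shot in-context learning prompt.

    demos: list of (text, label) pairs; with long-context models this
    can run to hundreds or thousands of examples.
    """
    lines = [instruction, ""]
    for text, label in demos:
        lines.append(f"Input: {text}")
        lines.append(f"Label: {label}")
    lines.append(f"Input: {query}")
    lines.append("Label:")
    return "\n".join(lines)

demos = [("great movie", "positive"), ("waste of time", "negative")] * 500
prompt = build_icl_prompt(demos, "surprisingly good")
print(len(prompt.split()), "words in a 1000-shot prompt")
```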
The MovieLens Datasets: History and Context
TIIS, 2016
F. M. Harper, J. A. Konstan
semanticscholar +1 more source
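For readers who want to work with the datasets this paper documents, a short pandas sketch of loading the ratings file follows. It assumes the ml-latest-small archive from grouplens.org has been extracted locally; the column names match its ratings.csv.

```python
import pandas as pd

ratings = pd.read_csv("ml-latest-small/ratings.csv")  # userId,movieId,rating,timestamp
print(ratings["rating"].describe())           # scale runs 0.5-5.0 in half steps
print(ratings.groupby("movieId")["rating"]    # per-movie average rating
      .mean().sort_values(ascending=False).head())
```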

