Results 21 to 30 of about 13,789,012 (283)

In-Context Convergence of Transformers [PDF]

open access: yes · International Conference on Machine Learning, 2023
Transformers have recently revolutionized many domains in modern machine learning, and one salient discovery is their remarkable in-context learning capability, where models can solve an unseen task by utilizing task-specific prompts without further ...
Yu Huang, Yuan Cheng, Yingbin Liang
semanticscholar   +1 more source

Odontometric parameters in human mandibular molars for sex estimation in a forensic context

open access: yes · Dental Anthropology, 2021
When encountering human skeletal remains in forensic contexts, one important step in the identification process is the estimation of sex, because it reduces the number of individuals to approximately one half. The pelvis and skull are considered the most ...
Sofia Fernandes Franco   +4 more
doaj   +1 more source

PRODIGY: Enabling In-context Learning Over Graphs [PDF]

open access: yes · Neural Information Processing Systems, 2023
In-context learning is the ability of a pretrained model to adapt to novel and diverse downstream tasks by conditioning on prompt examples, without optimizing any parameters.
Qian Huang   +6 more
semanticscholar   +1 more source

Symbol tuning improves in-context learning in language models [PDF]

open access: yes · Conference on Empirical Methods in Natural Language Processing, 2023
We present symbol tuning - finetuning language models on in-context input-label pairs where natural language labels (e.g., "positive/negative sentiment") are replaced with arbitrary symbols (e.g., "foo/bar"). Symbol tuning leverages the intuition that when ...
Jerry W. Wei   +10 more
semanticscholar   +1 more source

Context in Systemic Functional Linguistics [PDF]

open access: yes · Functions of Language, 2021
No abstract available.
Tom Bartlett, Wendy L. Bowcher
openaire   +1 more source

HyperAttention: Long-context Attention in Near-Linear Time [PDF]

open access: yes · International Conference on Learning Representations, 2023
We present an approximate attention mechanism named HyperAttention to address the computational challenges posed by the growing complexity of long contexts used in Large Language Models (LLMs).
Insu Han   +5 more
semanticscholar   +1 more source

Dr.ICL: Demonstration-Retrieved In-context Learning [PDF]

open access: yes · Data Intelligence, 2023
In-context learning (ICL), teaching a large language model (LLM) to perform a task with few-shot demonstrations rather than adjusting the model parameters, has emerged as a strong paradigm for using LLMs.
Man Luo   +7 more
semanticscholar   +1 more source

In-context Reinforcement Learning with Algorithm Distillation [PDF]

open access: yes · International Conference on Learning Representations, 2022
We propose Algorithm Distillation (AD), a method for distilling reinforcement learning (RL) algorithms into neural networks by modeling their training histories with a causal sequence model.
M. Laskin   +13 more
semanticscholar   +1 more source

Indicators of Parent-Child Relationships in the Context of Various Socio-Demographic Parameters

open access: yes · Интеграция образования, 2020
Introduction. The article addresses the problem of child-parent relationships (acceptance and behavior-control practices by fathers). The context of modern family trends and various socio-demographic indicators of family life in the Russian ...
Artur A. Rean, Ivan A. Konovalov
doaj   +1 more source

Global Context Vision Transformers [PDF]

open access: yes · International Conference on Machine Learning, 2022
We propose global context vision transformer (GC ViT), a novel architecture that enhances parameter and compute utilization for computer vision. Our method leverages global context self-attention modules, joint with standard local self-attention, to ...
Ali Hatamizadeh   +3 more
semanticscholar   +1 more source
