Results 11 to 20 of about 11,979,430

BLOOM: A 176B-Parameter Open-Access Multilingual Language Model [PDF]

open access: yes · arXiv.org, 2022
Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations ...
Teven Le Scao   +390 more
semanticscholar   +1 more source

Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing [PDF]

open access: yes · ACM Computing Surveys, 2021
This article surveys and organizes research works in a new paradigm in natural language processing, which we dub “prompt-based learning.” Unlike traditional supervised learning, which trains a model to take in an input x and predict an output y as P(y|x), ...
Pengfei Liu   +5 more
semanticscholar   +1 more source
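The contrast described in the “Pre-train, Prompt, and Predict” snippet above (training a task-specific classifier for P(y|x) versus recasting the task as a cloze prompt for a pretrained language model) can be illustrated with a minimal sketch. The example below assumes the Hugging Face transformers package and the generic bert-base-uncased checkpoint; neither comes from the survey itself, and the sentiment template is purely illustrative.

# Minimal sketch of prompt-based prediction (assumes `transformers` is installed;
# the model and template are illustrative choices, not from the surveyed paper).
from transformers import pipeline

# Traditional supervised learning would fit a task-specific classifier P(y|x).
# Prompt-based learning instead wraps the input in a natural-language template
# and lets a pretrained masked language model fill the label slot.
fill = pipeline("fill-mask", model="bert-base-uncased")

review = "The plot was predictable and the acting was flat."
prompt = f"{review} Overall, the movie was [MASK]."

# Score candidate fillers; each filler maps to a sentiment label (good/bad).
for pred in fill(prompt, targets=["good", "bad"]):
    print(pred["token_str"], round(pred["score"], 4))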

Emergent Abilities of Large Language Models [PDF]

open access: yes · Trans. Mach. Learn. Res., 2022
Scaling up language models has been shown to predictably improve performance and sample efficiency on a wide range of downstream tasks. This paper instead discusses an unpredictable phenomenon that we refer to as emergent abilities of large language ...
Jason Wei   +15 more
semanticscholar   +1 more source

Whole issue 14

open access: yes · Language Value, 2021
This is the fourteenth issue of Language Value, the journal created by the Department of English Studies at Universitat Jaume I (UJI) over 12 years ago.
Language Value
doaj   +1 more source

Survey of Hallucination in Natural Language Generation [PDF]

open access: yes · ACM Computing Surveys, 2022
Natural Language Generation (NLG) has improved exponentially in recent years thanks to the development of sequence-to-sequence deep learning technologies such as Transformer-based language models. This advancement has led to more fluent and coherent NLG, ...
Ziwei Ji   +11 more
semanticscholar   +1 more source

Whole issue 15.1

open access: yes · Language Value, 2022
Whole ...
Language Value
doaj   +1 more source

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding [PDF]

open access: yes · BlackboxNLP@EMNLP, 2018
Human ability to understand language is general, flexible, and robust. In contrast, most NLU models above the word level are designed for a specific task and struggle with out-of-domain data.
Alex Wang   +5 more
semanticscholar   +1 more source

Whole issue 15.2

open access: yes · Language Value, 2022
Whole issue 15 ...
Language Value
doaj   +1 more source

A large annotated corpus for learning natural language inference [PDF]

open access: yes · Conference on Empirical Methods in Natural Language Processing, 2015
Understanding entailment and contradiction is fundamental to understanding natural language, and inference about entailment and contradiction is a valuable testing ground for the development of semantic representations. However, machine learning research ...
Samuel R. Bowman   +3 more
semanticscholar   +1 more source

Evolutionary-scale prediction of atomic level protein structure with a language model

open access: yes · bioRxiv, 2022
Artificial intelligence has the potential to open insight into the structure of proteins at the scale of evolution. It has only recently been possible to extend protein structure prediction to two hundred million cataloged proteins.
Zeming Lin   +14 more
semanticscholar   +1 more source
