PromptSource: An Integrated Development Environment and Repository for Natural Language Prompts
PromptSource is a system for creating, sharing, and using natural language prompts. Prompts are functions that map an example from a dataset to a natural language input and target output.
Stephen H. Bach et al.
semanticscholar +1 more source
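The PromptSource entry above defines a prompt as a function that maps a dataset example to a natural language input and a target output. A minimal sketch of that idea (the field names and the sentiment task are hypothetical illustrations, not taken from the paper):

```python
# Minimal sketch: a prompt as a function from a raw dataset example
# to a natural language (input, target) pair.
# The field names "text" and "label" are hypothetical.
def sentiment_prompt(example):
    input_text = f"Review: {example['text']}\nIs this review positive or negative?"
    target = "positive" if example["label"] == 1 else "negative"
    return input_text, target

inp, tgt = sentiment_prompt({"text": "A quiet, moving film.", "label": 1})
```

A repository of such functions lets the same dataset be rendered under many different natural language formulations.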
Performance Study of N-grams in the Analysis of Sentiments
This study uses n-grams to classify sentiments with different machine learning and deep learning methods. The approach, which combines existing techniques, is applied to the problem of predicting sequence tags to …
O. E. Ojo et al.
doaj +1 more source
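The n-gram features mentioned in the entry above are simply contiguous token sequences extracted from the text. A minimal sketch of the extraction step (illustrative only, not the authors' code):

```python
def ngrams(tokens, n):
    # Contiguous n-grams over a token sequence; bigrams such as
    # ("not", "good") capture negation cues that unigrams miss.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

bigrams = ngrams("the movie was not good".split(), 2)
```

These n-gram counts would then be fed as features to a downstream classifier.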
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
M. Lewis et al.
semanticscholar +1 more source
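BART's training recipe as described above — corrupt text with a noising function, then learn to reconstruct the original — can be sketched with a toy token-deletion noiser (illustrative only; BART's actual noising functions include span masking, sentence permutation, and others):

```python
import random

def delete_tokens(tokens, p=0.3, rng=None):
    # Toy noising function: drop each token with probability p.
    # A denoising seq2seq model is trained to map the corrupted
    # sequence back to the original one.
    rng = rng or random.Random(0)
    kept = [t for t in tokens if rng.random() >= p]
    return kept if kept else tokens[:1]  # never return an empty input

original = "the quick brown fox jumps over the lazy dog".split()
corrupted = delete_tokens(original)
# training pair: model input = corrupted, target = original
```

Because the noising function is arbitrary, the same reconstruction objective covers masking, deletion, infilling, and reordering.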
Translating Natural Language to Planning Goals with Large-Language Models
Recent large language models (LLMs) have demonstrated remarkable performance on a variety of natural language processing (NLP) tasks, leading to intense excitement about their applicability across various domains.
Yaqi Xie et al.
semanticscholar +1 more source
Semi‐supervised classification of fundus images combined with CNN and GCN
Purpose: Diabetic retinopathy (DR) is one of the most serious complications of diabetes, a fundus lesion with characteristic changes. Early diagnosis of DR can effectively reduce the visual damage it causes. Because DR lesions vary in type and morphology, automatic classification of fundus images in mass screening can …
Sixu Duan et al.
wiley +1 more source
Design of autonomous family companion robot based on ROS
To assist left-behind children, empty-nest elderly people, and people with reduced mobility, an autonomous family companion robot was designed.
Li Jianyong et al.
doaj +1 more source
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Human ability to understand language is general, flexible, and robust. In contrast, most NLU models above the word level are designed for a specific task and struggle with out-of-domain data.
Alex Wang et al.
semanticscholar +1 more source
CLAP Learning Audio Concepts from Natural Language Supervision
Mainstream machine listening models are trained to learn audio concepts under the paradigm of one class label for many recordings, focusing on a single task.
Benjamin Elizalde et al.
semanticscholar +1 more source
Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web.
Yu Gu et al.
semanticscholar +1 more source
Translation between Molecules and Natural Language
We present MolT5 - a self-supervised learning framework for pretraining models on a vast amount of unlabeled natural language text and molecule strings.
Carl N. Edwards et al.
semanticscholar +1 more source