Results 21 to 30 of about 9,260,716

Performance Study of N-grams in the Analysis of Sentiments

open access: yesJournal of Nigerian Society of Physical Sciences, 2021
In this work, a study was carried out using n-grams to classify sentiments with different machine learning and deep learning methods. We applied this approach, which combines existing techniques, to the problem of predicting sequence tags ...
O. E. Ojo   +3 more
doaj   +1 more source
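As a loose illustration of the n-gram features this entry refers to (a minimal sketch, not the authors' implementation), word n-grams can be extracted from a tokenized sentence like this:

```python
def ngrams(tokens, n):
    """Return all contiguous n-grams (as tuples) over a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "the movie was not good".split()
bigrams = ngrams(tokens, 2)
# Bigrams keep short-range context such as the negation pair
# ("not", "good"), which unigram features would lose -- one reason
# n-grams remain competitive features for sentiment classification.
```

These tuples would then be counted or hashed into a feature vector for a downstream classifier.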

Translating Natural Language to Planning Goals with Large-Language Models [PDF]

open access: yesarXiv.org, 2023
Recent large language models (LLMs) have demonstrated remarkable performance on a variety of natural language processing (NLP) tasks, leading to intense excitement about their applicability across various domains.
Yaqi Xie   +5 more
semanticscholar   +1 more source

Design of autonomous family companion robot based on ROS

open access: yesDianzi Jishu Yingyong, 2021
To address the needs of left-behind children, empty-nest elderly people, and people with reduced mobility, an autonomous family companion robot was designed.
Li Jianyong   +3 more
doaj   +1 more source

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [PDF]

open access: yesAnnual Meeting of the Association for Computational Linguistics, 2019
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
M. Lewis   +7 more
semanticscholar   +1 more source
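The BART abstract describes corrupting text with a noising function and training a model to reconstruct the original. As a toy sketch of one such noising function (random token masking; the paper itself uses richer corruptions such as span infilling and sentence permutation):

```python
import random

def mask_tokens(tokens, mask_rate=0.3, mask_token="<mask>", seed=0):
    """Randomly replace a fraction of tokens with a mask symbol.

    A simplified stand-in for BART-style noising: the corrupted
    sequence is the encoder input, and the original sequence is
    the decoder's reconstruction target.
    """
    rng = random.Random(seed)
    return [mask_token if rng.random() < mask_rate else t for t in tokens]

original = "the quick brown fox jumps over the lazy dog".split()
corrupted = mask_tokens(original)
# A sequence-to-sequence model is then trained to map
# `corrupted` back to `original`.
```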

CLAP Learning Audio Concepts from Natural Language Supervision

open access: yesIEEE International Conference on Acoustics, Speech, and Signal Processing, 2023
Mainstream machine listening models are trained to learn audio concepts under the paradigm of one class label to many recordings focusing on one task.
Benjamin Elizalde   +3 more
semanticscholar   +1 more source

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding [PDF]

open access: yesBlackboxNLP@EMNLP, 2018
Human ability to understand language is general, flexible, and robust. In contrast, most NLU models above the word level are designed for a specific task and struggle with out-of-domain data.
Alex Wang   +5 more
semanticscholar   +1 more source
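GLUE's headline number is an unweighted macro-average over its per-task scores (tasks with multiple metrics are averaged first). A minimal sketch, assuming a hypothetical dict of already-computed task scores:

```python
def glue_macro_average(task_scores):
    """Unweighted mean of per-task scores, as in the GLUE
    aggregate benchmark score."""
    return sum(task_scores.values()) / len(task_scores)

# Hypothetical per-task scores, not real leaderboard numbers.
scores = {"CoLA": 0.45, "SST-2": 0.92, "MRPC": 0.88}
print(round(glue_macro_average(scores), 3))  # 0.75
```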

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing [PDF]

open access: yesACM Trans. Comput. Heal., 2020
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web.
Yu Gu   +8 more
semanticscholar   +1 more source

Translation between Molecules and Natural Language [PDF]

open access: yesConference on Empirical Methods in Natural Language Processing, 2022
We present MolT5 - a self-supervised learning framework for pretraining models on a vast amount of unlabeled natural language text and molecule strings.
Carl N. Edwards   +4 more
semanticscholar   +1 more source

A large annotated corpus for learning natural language inference [PDF]

open access: yesConference on Empirical Methods in Natural Language Processing, 2015
Understanding entailment and contradiction is fundamental to understanding natural language, and inference about entailment and contradiction is a valuable testing ground for the development of semantic representations. However, machine learning research ...
Samuel R. Bowman   +3 more
semanticscholar   +1 more source

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages [PDF]

open access: yesAnnual Meeting of the Association for Computational Linguistics, 2020
We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages. Compared to existing widely used toolkits, Stanza features a language-agnostic fully neural pipeline for text analysis, including tokenization ...
Peng Qi   +4 more
semanticscholar   +1 more source
