Results 11 to 20 of about 9,473,317

Translating Natural Language to Planning Goals with Large-Language Models [PDF]

open access: yes (arXiv.org, 2023)
Recent large language models (LLMs) have demonstrated remarkable performance on a variety of natural language processing (NLP) tasks, leading to intense excitement about their applicability across various domains.
Yaqi Xie   +5 more
semanticscholar   +1 more source
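The entry above describes using LLMs to translate natural-language instructions into formal planning goals. A minimal, hypothetical sketch of the prompting step in Python (the predicate set, template, and example are illustrative assumptions, not the paper's actual prompt):

```python
# Illustrative sketch (not the paper's code): build a prompt asking an LLM
# to translate a natural-language instruction into a PDDL :goal expression.
# The domain predicates below are hypothetical blocks-world examples.
PROMPT_TEMPLATE = """You are a planning assistant.
Domain predicates: (on ?x ?y), (clear ?x), (holding ?x)
Translate the instruction into a PDDL :goal expression.

Instruction: {instruction}
PDDL goal:"""

def build_goal_prompt(instruction: str) -> str:
    """Return a prompt that an LLM could complete with a PDDL goal."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

if __name__ == "__main__":
    print(build_goal_prompt("Stack block A on block B."))
    # A model might complete this with: (:goal (and (on A B) (clear A)))
```

The actual LLM call is left abstract here; the paper's contribution is the translation task itself, not a specific API.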

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension [PDF]

open access: yes (Annual Meeting of the Association for Computational Linguistics, 2019)
We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
M. Lewis   +7 more
semanticscholar   +1 more source
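A minimal sketch of BART's reconstruction objective using the Hugging Face transformers library. The toy single-mask corruption shown is only illustrative; BART's actual pretraining applies richer noising (span infilling, sentence permutation) over large corpora:

```python
from transformers import BartTokenizer, BartForConditionalGeneration

tok = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

original = "BART is trained to reconstruct corrupted input text."
corrupted = "BART is trained to <mask> corrupted input text."  # toy corruption

inputs = tok(corrupted, return_tensors="pt")
labels = tok(original, return_tensors="pt").input_ids

# The model is trained to reconstruct the original from the corrupted input.
loss = model(**inputs, labels=labels).loss
print(loss.item())
```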

Making Sense of Language Signals for Monitoring Radicalization

open access: yes (Applied Sciences, 2022)
Understanding radicalization pathways, drivers, and factors is essential for the effective design of prevention and counter-radicalization programs. Traditionally, the primary methods used by social scientists to detect these drivers and factors include ...
Óscar Araque   +7 more
doaj   +1 more source

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding [PDF]

open access: yes (BlackboxNLP@EMNLP, 2018)
Human ability to understand language is general, flexible, and robust. In contrast, most NLU models above the word level are designed for a specific task and struggle with out-of-domain data.
Alex Wang   +5 more
semanticscholar   +1 more source
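The GLUE tasks are conveniently available through the Hugging Face datasets library; a small sketch loading one of the benchmark's tasks (SST-2 sentiment):

```python
from datasets import load_dataset

# Load one GLUE task (SST-2 sentiment classification) from the benchmark.
sst2 = load_dataset("glue", "sst2")
example = sst2["train"][0]
print(example["sentence"], "->", example["label"])  # 0 = negative, 1 = positive
```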

CLAP Learning Audio Concepts from Natural Language Supervision

open access: yes (IEEE International Conference on Acoustics, Speech, and Signal Processing, 2023)
Mainstream machine listening models are trained to learn audio concepts under the paradigm of one class label for many recordings, focusing on one task.
Benjamin Elizalde   +3 more
semanticscholar   +1 more source
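CLAP instead learns from paired audio and text via a CLIP-style symmetric contrastive objective. A sketch of that loss in PyTorch (batch size, embedding dimension, and temperature are illustrative, not the paper's settings):

```python
import torch
import torch.nn.functional as F

def clap_style_loss(audio_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of paired audio/text embeddings."""
    a = F.normalize(audio_emb, dim=-1)
    t = F.normalize(text_emb, dim=-1)
    logits = a @ t.T / temperature        # (B, B) pairwise similarities
    targets = torch.arange(a.size(0))     # matched pairs lie on the diagonal
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.T, targets)) / 2

# Toy usage with random tensors standing in for encoder outputs.
loss = clap_style_loss(torch.randn(8, 512), torch.randn(8, 512))
print(loss.item())
```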

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing [PDF]

open access: yes (ACM Transactions on Computing for Healthcare, 2020)
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web.
Yu Gu   +8 more
semanticscholar   +1 more source
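The paper's domain-specific model can be exercised with a standard masked-language-modeling pipeline. A sketch assuming Hugging Face transformers; the checkpoint name is an assumption based on how the released model is commonly published on the hub:

```python
from transformers import pipeline

# Checkpoint name assumed (the paper's released PubMedBERT model on the hub).
fill = pipeline(
    "fill-mask",
    model="microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract",
)
for pred in fill("The patient was treated with [MASK] for the infection."):
    print(pred["token_str"], round(pred["score"], 3))
```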

A large annotated corpus for learning natural language inference [PDF]

open access: yes (Conference on Empirical Methods in Natural Language Processing, 2015)
Understanding entailment and contradiction is fundamental to understanding natural language, and inference about entailment and contradiction is a valuable testing ground for the development of semantic representations. However, machine learning research ...
Samuel R. Bowman   +3 more
semanticscholar   +1 more source
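The corpus introduced here (SNLI) pairs premises with hypotheses labeled for entailment, neutrality, or contradiction. A sketch loading it via the Hugging Face datasets library:

```python
from datasets import load_dataset

snli = load_dataset("snli")
ex = snli["train"][0]
# label: 0 = entailment, 1 = neutral, 2 = contradiction (-1 = unlabeled)
print(ex["premise"], "|", ex["hypothesis"], "|", ex["label"])
```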

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages [PDF]

open access: yes (Annual Meeting of the Association for Computational Linguistics, 2020)
We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages. Compared to existing widely used toolkits, Stanza features a language-agnostic fully neural pipeline for text analysis, including tokenization ...
Peng Qi   +4 more
semanticscholar   +1 more source
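Stanza's pipeline API is straightforward; a minimal sketch running tokenization, part-of-speech tagging, and lemmatization on English text:

```python
import stanza

stanza.download("en")  # fetch the English models once
nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma")
doc = nlp("Stanza supports 66 human languages.")
for word in doc.sentences[0].words:
    print(word.text, word.upos, word.lemma)
```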

Translation between Molecules and Natural Language [PDF]

open access: yes (Conference on Empirical Methods in Natural Language Processing, 2022)
We present MolT5 - a self-supervised learning framework for pretraining models on a vast amount of unlabeled natural language text and molecule strings.
Carl N. Edwards   +4 more
semanticscholar   +1 more source
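MolT5 checkpoints follow the standard T5 seq2seq interface, so translation between molecule strings and text can be sketched with transformers. The checkpoint name below is an assumption (the authors distribute T5-style weights on the Hugging Face hub), and the base pretrained model would need the released fine-tunes for useful captioning:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Checkpoint name assumed; MolT5 weights are T5-style models on the hub.
tok = T5Tokenizer.from_pretrained("laituan245/molt5-small")
model = T5ForConditionalGeneration.from_pretrained("laituan245/molt5-small")

inputs = tok("c1ccccc1", return_tensors="pt")  # SMILES string for benzene
ids = model.generate(**inputs, max_new_tokens=32)
print(tok.decode(ids[0], skip_special_tokens=True))
```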

Natural Language Reasoning, A Survey [PDF]

open access: yes (ACM Computing Surveys, 2023)
This survey article proposes a clearer view of Natural Language Reasoning (NLR) in the field of Natural Language Processing (NLP), both conceptually and practically.
Fei Yu, Hongbo Zhang, Benyou Wang
semanticscholar   +1 more source
