Results 1 to 10 of about 2,531,064

Revisiting Pre-Trained Models for Chinese Natural Language Processing [PDF]

open access: yes (Findings of EMNLP, 2020)
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous improvements across various NLP tasks, and consecutive variants have been proposed to further improve the performance of the pre-trained language models. In this paper, we ...
Che, Wanxiang   +5 more
core   +2 more sources
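
As a rough illustration of how a Chinese pre-trained checkpoint like the ones this entry studies is used, here is a minimal sketch with the Hugging Face transformers library; the checkpoint name "hfl/chinese-macbert-base" is an assumption for illustration, not something stated in the snippet.

    from transformers import AutoTokenizer, AutoModelForMaskedLM

    # Load a Chinese pre-trained masked language model (checkpoint name assumed).
    tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-macbert-base")
    model = AutoModelForMaskedLM.from_pretrained("hfl/chinese-macbert-base")

    # "Harbin is the capital of Heilongjiang."
    inputs = tokenizer("哈尔滨是黑龙江的省会。", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.logits.shape)  # (batch size, sequence length, vocabulary size)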

Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing [PDF]

open access: yes (ACM Computing Surveys, 2021)
This article surveys and organizes research works in a new paradigm in natural language processing, which we dub “prompt-based learning.” Unlike traditional supervised learning, which trains a model to take in an input x and predict an output y as P(y|x), ...
Pengfei Liu   +5 more
semanticscholar   +1 more source
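
As a toy illustration of the prompt-based paradigm sketched above, the snippet below wraps an input in a cloze-style template and reads the label off a masked-language-model prediction, using the Hugging Face fill-mask pipeline (an assumption; the survey itself is framework-agnostic).

    from transformers import pipeline

    # Cloze-style prompting: instead of training a classifier for P(y|x),
    # score candidate label words in a template with a pre-trained MLM.
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    review = "The movie was absolutely wonderful."
    prompt = f"{review} Overall, it was a [MASK] film."

    for candidate in fill_mask(prompt, targets=["great", "terrible"]):
        print(candidate["token_str"], round(candidate["score"], 4))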

Is ChatGPT a General-Purpose Natural Language Processing Task Solver? [PDF]

open access: yes (Conference on Empirical Methods in Natural Language Processing, 2023)
Spurred by advancements in scale, large language models (LLMs) have demonstrated the ability to perform a variety of natural language processing (NLP) tasks zero-shot -- i.e., without adaptation on downstream data.
Chengwei Qin   +5 more
semanticscholar   +1 more source
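
A minimal sketch of zero-shot prompting in the sense used above, with the OpenAI Python client; the client, model name, and task wording are illustrative assumptions, not the paper's evaluation setup.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Zero-shot: the task is specified entirely in the prompt,
    # with no fine-tuning on downstream data.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user",
                   "content": "Classify the sentiment of this review as positive or negative: "
                              "'The food was cold and the service was slow.'"}],
    )
    print(response.choices[0].message.content)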

Organizing an in-class hackathon to correct PDF-to-text conversion errors of Genomics & Informatics 1.0 [PDF]

open access: yes (Genomics & Informatics, 2020)
This paper describes a community effort to improve earlier versions of the full-text corpus of Genomics & Informatics by semi-automatically detecting and correcting PDF-to-text conversion errors and optical character recognition errors during the first ...
Sunho Kim   +44 more
doaj   +1 more source
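
For flavor, here is a toy sketch (not the corpus's actual pipeline) of the kind of rule-based cleanup that semi-automatic PDF-to-text correction can start from: replacing broken ligatures and rejoining words hyphenated across line breaks.

    import re

    # Common PDF-to-text artifacts: ligature characters and end-of-line hyphenation.
    LIGATURES = {"\ufb01": "fi", "\ufb02": "fl", "\ufb00": "ff"}

    def clean(text: str) -> str:
        for bad, good in LIGATURES.items():
            text = text.replace(bad, good)
        # "identi-\nfication" -> "identification"
        return re.sub(r"(\w)-\n(\w)", r"\1\2", text)

    print(clean("gene identi-\nfication with speci\ufb01c markers"))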

Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey [PDF]

open access: yes (ACM Computing Surveys, 2021)
Large, pre-trained language models (PLMs) such as BERT and GPT have drastically changed the Natural Language Processing (NLP) field. For numerous NLP tasks, approaches leveraging PLMs have achieved state-of-the-art performance. The key idea is to learn a ...
Bonan Min   +8 more
semanticscholar   +1 more source
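
The pretrain-then-fine-tune recipe that the survey covers can be sketched in a few lines with the transformers library (the model name and labels here are illustrative assumptions): a generic pre-trained encoder gets a task head and is adapted on downstream data.

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

    # One fine-tuning forward/backward pass on a tiny illustrative batch.
    batch = tokenizer(["a delightful read", "a tedious slog"], padding=True, return_tensors="pt")
    labels = torch.tensor([1, 0])

    outputs = model(**batch, labels=labels)
    outputs.loss.backward()  # gradients an optimizer step would consume
    print(float(outputs.loss))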

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing [PDF]

open access: yes (ACM Transactions on Computing for Healthcare, 2020)
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web.
Yu Gu   +8 more
semanticscholar   +1 more source
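
As a generic sketch of masked-language-model pretraining on in-domain text (not the paper's own recipe or data), the snippet below runs a few MLM training steps with the transformers Trainer on a tiny stand-in "biomedical" corpus.

    from torch.utils.data import Dataset
    from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                              DataCollatorForLanguageModeling, Trainer, TrainingArguments)

    class DomainCorpus(Dataset):
        """Tiny stand-in for a biomedical corpus such as PubMed abstracts."""
        def __init__(self, texts, tokenizer):
            self.examples = [tokenizer(t, truncation=True, max_length=128) for t in texts]
        def __len__(self):
            return len(self.examples)
        def __getitem__(self, idx):
            return self.examples[idx]

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    texts = ["The patient received adjuvant tamoxifen therapy.",
             "EGFR mutations predict response to gefitinib."]

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="domain-mlm", num_train_epochs=1,
                               per_device_train_batch_size=2),
        train_dataset=DomainCorpus(texts, tokenizer),
        data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15),
    )
    trainer.train()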

RYANSQL: Recursively Applying Sketch-based Slot Fillings for Complex Text-to-SQL in Cross-Domain Databases

open access: yes (Computational Linguistics, 2021)
Text-to-SQL is the problem of converting a user question into an SQL query, when the question and database are given. In this article, we present a neural network approach called RYANSQL (Recursively Yielding Annotation Network for SQL) to solve complex ...
DongHyun Choi   +3 more
doaj   +1 more source
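
To make the task setting above concrete, here is a toy illustration (not RYANSQL itself) of a Text-to-SQL example: a question, the database it refers to, and a predicted SQL query executed with Python's built-in sqlite3.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE singer (name TEXT, country TEXT, age INTEGER)")
    conn.executemany("INSERT INTO singer VALUES (?, ?, ?)",
                     [("Ann", "France", 30), ("Bo", "Japan", 25), ("Cy", "France", 41)])

    question = "How many singers are from France?"
    predicted_sql = "SELECT COUNT(*) FROM singer WHERE country = 'France'"  # what a text-to-SQL model should emit

    print(question, "->", conn.execute(predicted_sql).fetchone()[0])  # -> 2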

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages [PDF]

open access: yes (Annual Meeting of the Association for Computational Linguistics, 2020)
We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages. Compared to existing widely used toolkits, Stanza features a language-agnostic fully neural pipeline for text analysis, including tokenization ...
Peng Qi   +4 more
semanticscholar   +1 more source
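
A minimal usage sketch of the Stanza pipeline described above (the English models are downloaded on first use):

    import stanza

    stanza.download("en")  # fetch the English models once
    nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma")

    doc = nlp("Stanza supports 66 human languages.")
    for sentence in doc.sentences:
        for word in sentence.words:
            print(word.text, word.upos, word.lemma)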

Performance Study of N-grams in the Analysis of Sentiments

open access: yes (Journal of the Nigerian Society of Physical Sciences, 2021)
In this work, an investigation was carried out using n-grams to classify sentiments with different machine learning and deep learning methods. We used this approach, which combines existing techniques, with the problem of predicting sequence tags to ...
O. E. Ojo   +3 more
doaj   +1 more source
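
A minimal sketch of the general idea, n-gram features feeding a sentiment classifier, using scikit-learn (not the paper's exact models or data):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = ["great movie, loved it", "terrible plot and bad acting",
             "loved the acting", "bad movie, not great at all"]
    labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

    # Unigram + bigram counts feed a linear classifier.
    model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(texts, labels)
    print(model.predict(["loved this great movie"]))  # expected: [1]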

PyThaiNLP: Thai Natural Language Processing in Python [PDF]

open access: yes (NLPOSS, 2023)
We present PyThaiNLP, a free and open-source natural language processing (NLP) library for the Thai language, implemented in Python. It provides a wide range of software, models, and datasets for the Thai language.
Wannaphong Phatthiyaphaibun   +8 more
semanticscholar   +1 more source
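
A minimal usage sketch of PyThaiNLP's word tokenizer, one of the utilities the library provides (Thai is written without spaces between words):

    from pythainlp.tokenize import word_tokenize

    # "Thai is a language without spaces between words."
    print(word_tokenize("ภาษาไทยเป็นภาษาที่ไม่มีช่องว่างระหว่างคำ"))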
