Results 11 to 20 of about 2,554,377
Natural language processing [PDF]
Beginning with the basic issues of NLP, this chapter aims to chart the major research activities in this area since the last ARIST Chapter in 1996 (Haas, 1996), including: (i) natural language text processing systems - text summarization, information ...
Adams+101 more
core +5 more sources
Revisiting Pre-Trained Models for Chinese Natural Language Processing [PDF]
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous improvements across various NLP tasks, and consecutive variants have been proposed to further improve the performance of the pre-trained language models. In this paper, we ...
Che, Wanxiang+5 more
core +2 more sources
Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing [PDF]
This article surveys and organizes research works in a new paradigm in natural language processing, which we dub “prompt-based learning.” Unlike traditional supervised learning, which trains a model to take in an input x and predict an output y as P(y|x), ...
Pengfei Liu+5 more
semanticscholar +1 more source
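As a quick illustration of the prompting paradigm this survey organizes, here is a minimal sketch that recasts sentiment classification as a cloze task. It assumes the Hugging Face transformers fill-mask pipeline; the template, label words, and example review are invented for illustration and are not from the article.

```python
# Prompt-based learning sketch: instead of training a classifier for P(y|x),
# wrap the input in a template and let a masked LM score label words.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

review = "The movie was absolutely wonderful."  # the input x
# Hypothetical template with a slot where a label word should appear.
prompt = f"{review} Overall, it was a [MASK] film."

# Score candidate label words at the masked position.
results = fill_mask(prompt, targets=["great", "terrible"])
scores = {r["token_str"]: r["score"] for r in results}
label = "positive" if scores["great"] > scores["terrible"] else "negative"
print(scores, label)
```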
Natural Language Processing [PDF]
Natural language processing (NLP) refers to the field of study that focuses on the interactions between human language and computers. It is a computational approach to text analysis, applying a wide range of computational techniques for the understanding, automatic analysis, and representation of human language.
Yu Zhou+2 more
+8 more sources
Is ChatGPT a General-Purpose Natural Language Processing Task Solver? [PDF]
Spurred by advancements in scale, large language models (LLMs) have demonstrated the ability to perform a variety of natural language processing (NLP) tasks zero-shot -- i.e., without adaptation on downstream data.
Chengwei Qin+5 more
semanticscholar +1 more source
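To make the zero-shot setting concrete, here is a minimal sketch of how a task can be posed to an LLM without any downstream adaptation. The task framing and prompt wording are invented for illustration and are not taken from the paper's evaluation.

```python
# Zero-shot task framing: the model receives only an instruction and the
# input, with no task-specific fine-tuning and no in-context examples.
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
prompt = (
    "Decide whether the premise entails the hypothesis. "
    "Answer 'entailment', 'neutral', or 'contradiction'.\n"
    f"Premise: {premise}\nHypothesis: {hypothesis}\nAnswer:"
)
print(prompt)  # sent as-is to the model; the reply is taken as the prediction
```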
Attention in Natural Language Processing [PDF]
Attention is an increasingly popular mechanism used in a wide range of neural architectures. The mechanism itself has been realized in a variety of formats. However, because of the fast-paced advances in this domain, a systematic overview of attention is still missing.
Andrea Galassi+2 more
openaire +4 more sources
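Since the survey above centers on the attention mechanism itself, a minimal sketch of its most common form, scaled dot-product attention, may help; the function and array shapes here are illustrative, not taken from the article.

```python
# Scaled dot-product attention: each query attends over all keys, and the
# output is the attention-weighted sum of the values.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)      # query-key compatibility, scaled
    weights = softmax(scores, axis=-1)   # one distribution per query position
    return weights @ v                   # weighted sum of value vectors

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(4, 8)) for _ in range(3))
print(attention(q, k, v).shape)  # (4, 8)
```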
Organizing an in-class hackathon to correct PDF-to-text conversion errors of Genomics & Informatics 1.0 [PDF]
This paper describes a community effort to improve earlier versions of the full-text corpus of Genomics & Informatics by semi-automatically detecting and correcting PDF-to-text conversion errors and optical character recognition errors during the first ...
Sunho Kim+44 more
doaj +1 more source
Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey [PDF]
Large, pre-trained language models (PLMs) such as BERT and GPT have drastically changed the Natural Language Processing (NLP) field. For numerous NLP tasks, approaches leveraging PLMs have achieved state-of-the-art performance. The key idea is to learn a ...
Bonan Min+8 more
semanticscholar +1 more source
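As a pointer to what "leveraging PLMs" typically looks like in practice, here is a minimal sketch of the pretrain-then-fine-tune recipe: pretrained weights are reused and a fresh task head is attached. It assumes the Hugging Face transformers API; the model name and toy inputs are common examples, not the survey's.

```python
# Pretrain-then-fine-tune sketch: load pretrained BERT weights and attach a
# randomly initialized classification head for a downstream task.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # new head; body keeps pretrained weights
)

batch = tokenizer(["great movie", "dull plot"], padding=True, return_tensors="pt")
logits = model(**batch).logits  # fine-tuning would backpropagate a loss from here
print(logits.shape)  # torch.Size([2, 2])
```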
Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing [PDF]
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on general domain corpora, such as newswire and Web.
Yu Gu+8 more
semanticscholar +1 more source
RYANSQL: Recursively Applying Sketch-based Slot Fillings for Complex Text-to-SQL in Cross-Domain Databases [PDF]
Text-to-SQL is the problem of converting a user question into an SQL query when the question and the database are given. In this article, we present a neural network approach called RYANSQL (Recursively Yielding Annotation Network for SQL) to solve complex ...
DongHyun Choi+3 more
doaj +1 more source
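To make the text-to-SQL task concrete, here is a toy end-to-end illustration; the schema, question, and query are invented, and the model's prediction is stubbed in (RYANSQL itself generates the SQL with a sketch-based neural decoder).

```python
# Text-to-SQL setting: given a question and a database, produce an SQL query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE singer (singer_id INTEGER, name TEXT, age INTEGER)")
conn.executemany("INSERT INTO singer VALUES (?, ?, ?)",
                 [(1, "Ann", 34), (2, "Bo", 27), (3, "Cy", 41)])

question = "How many singers are older than 30?"
predicted_sql = "SELECT COUNT(*) FROM singer WHERE age > 30"  # stubbed model output

print(conn.execute(predicted_sql).fetchone()[0])  # -> 2
```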