
Language (Technology) is Power: A Critical Survey of "Bias" in NLP [PDF]

open access: yes; Annual Meeting of the Association for Computational Linguistics, 2020
We survey 146 papers analyzing "bias" in NLP systems, finding that their motivations are often vague, inconsistent, and lacking in normative reasoning, despite the fact that analyzing "bias" is an inherently normative process.
Barocas, Solon   +3 more
core   +2 more sources

Rationalization for explainable NLP: a survey [PDF]

open access: yes; Frontiers in Artificial Intelligence, 2023
Recent advances in deep learning have improved the performance of many Natural Language Processing (NLP) tasks such as translation, question-answering, and text classification. However, this improvement comes at the expense of model explainability. Black- ...
Sai Gurrapu   +5 more
semanticscholar   +5 more sources

Large Language Models Meet NLP: A Survey [PDF]

open access: yes; arXiv
While large language models (LLMs) like ChatGPT have shown impressive capabilities in Natural Language Processing (NLP) tasks, a systematic investigation of their potential in this field remains largely unexplored. This study aims to address this gap by exploring the following questions: (1) How are LLMs currently applied to NLP tasks in the literature?
Libo Qin   +8 more
arxiv   +2 more sources

The Nature of NLP: Analyzing Contributions in NLP Papers [PDF]

open access: yes; arXiv
Natural Language Processing (NLP) is a dynamic, interdisciplinary field that integrates intellectual traditions from computer science, linguistics, social science, and more. Despite its established presence, the definition of what constitutes NLP research remains debated.
Pramanick, Aniket   +3 more
arxiv   +3 more sources

Towards Systematic Monolingual NLP Surveys: GenA of Greek NLP [PDF]

open access: yes; arXiv
Natural Language Processing (NLP) research has traditionally been predominantly focused on English, driven by the availability of resources, the size of the research community, and market demands. Recently, there has been a noticeable shift towards multilingualism in NLP, recognizing the need for inclusivity and effectiveness across diverse languages ...
Bakagianni, Juli   +3 more
arxiv   +3 more sources

From Pretraining Data to Language Models to Downstream Tasks: Tracking the Trails of Political Biases Leading to Unfair NLP Models [PDF]

open access: yes; Annual Meeting of the Association for Computational Linguistics, 2023
Language models (LMs) are pretrained on diverse data sources—news, discussion forums, books, online encyclopedias. A significant portion of this data includes facts and opinions which, on one hand, celebrate democracy and diversity of ideas, and on the ...
Shangbin Feng   +3 more
semanticscholar   +1 more source

Preregistering NLP research [PDF]

open access: yes; Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021
Accepted at NAACL2021; pre-final draft, comments ...
Chris van der Lee   +2 more
openaire   +3 more sources

Super-NaturalInstructions: Generalization via Declarative Instructions on 1600+ NLP Tasks [PDF]

open access: yes; Conference on Empirical Methods in Natural Language Processing, 2022
How well can NLP models generalize to a variety of unseen tasks when provided with task instructions? To address this question, we first introduce Super-NaturalInstructions, a benchmark of 1,616 diverse NLP tasks and their expert-written instructions ...
Yizhong Wang   +39 more
semanticscholar   +1 more source

Are NLP Models really able to Solve Simple Math Word Problems? [PDF]

open access: yes; North American Chapter of the Association for Computational Linguistics, 2021
The problem of designing NLP solvers for math word problems (MWP) has seen sustained research activity and steady gains in the test accuracy. Since existing solvers achieve high performance on the benchmark datasets for elementary level MWPs containing ...
Arkil Patel   +2 more
semanticscholar   +1 more source

Grounding ‘Grounding’ in NLP [PDF]

open access: yes; Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021
The NLP community has seen substantial recent interest in grounding to facilitate interaction between language technologies and the world. However, as a community, we use the term broadly to reference any linking of text to data or non-textual modality. In contrast, Cognitive Science more formally defines "grounding" as the process of establishing what ...
Yonatan Bisk   +2 more
openaire   +2 more sources
