Results 61 to 70 of about 7,007,067
Information Flow in Pregroup Models of Natural Language [PDF]
This paper is about pregroup models of natural languages, and how they relate to the explicitly categorical use of pregroups in Compositional Distributional Semantics and Natural Language Processing. These categorical interpretations make certain assumptions about the nature of natural languages that, when stated formally, may be seen to impose strong ...
arxiv +1 more source
Sentences share with equations properties of discrete, linear infinities; distinct symbol types; alternation of symbol types in the signal; deep-structure with a main verb and recursion; associative, commutative, and distributive properties; autonomous levels of organization; paraphrase, ellipsis, and ambiguity; powers of assertion, truth and falsity ...
openaire +3 more sources
PIQA: Reasoning about Physical Commonsense in Natural Language [PDF]
To apply eyeshadow without a brush, should I use a cotton swab or a toothpick? Questions requiring this kind of physical commonsense pose a challenge to today's natural language understanding systems.
Yonatan Bisk +4 more
semanticscholar +1 more source
Natural Language to Code: How Far Are We?
A longstanding dream in software engineering research is to devise effective approaches for automating development tasks based on developers' informally-specified intentions. Such intentions are generally in the form of natural language descriptions.
Shangwen Wang +9 more
semanticscholar +1 more source
Revisiting Pre-Trained Models for Chinese Natural Language Processing [PDF]
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous improvements across various NLP tasks, and consecutive variants have been proposed to further improve the performance of the pre-trained language models. In this paper, we ...
Yiming Cui +5 more
semanticscholar +1 more source
Using natural language prompts for machine translation [PDF]
We explore the use of natural language prompts for controlling various aspects of the outputs generated by machine translation models. We demonstrate that natural language prompts allow us to influence properties like formality or specific dialect of the output.
arxiv
Language is God's unique gift to humanity. Human civilization as we know it today would have been impossible without language. Language is all around us. It can be found in our thoughts and dreams, prayers and meditations, relationships and communication.
openaire +1 more source
Multi-Task Deep Neural Networks for Natural Language Understanding [PDF]
In this paper, we present a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple natural language understanding (NLU) tasks.
Xiaodong Liu +3 more
semanticscholar +1 more source
Brains and algorithms partially converge in natural language processing
Deep learning algorithms trained to predict masked words from large amounts of text have recently been shown to generate activations similar to those of the human brain. However, what drives this similarity remains currently unknown.
C. Caucheteux, J. King
semanticscholar +1 more source
A Survey of the Usages of Deep Learning for Natural Language Processing
Over the last several years, the field of natural language processing has been propelled forward by an explosion in the use of deep learning models.
Dan Otter, Julian R. Medina, J. Kalita
semanticscholar +1 more source