Results 11 to 20 of about 259,269
Dynamic Contextualized Word Embeddings [PDF]
Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and ...
Hofmann, V +2 more
openaire +2 more sources
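The contextual variability this entry targets is easy to observe directly. Below is a minimal sketch, assuming the Hugging Face transformers package and the public bert-base-uncased checkpoint (an illustrative contextualized model, not the paper's dynamic one), showing the same surface form receiving different vectors in different sentences:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector of `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    position = (inputs["input_ids"][0]
                == tokenizer.convert_tokens_to_ids(word)).nonzero()[0].item()
    return hidden[position]

# A static embedding would assign "bank" one vector; here the two
# occurrences diverge because the surrounding context differs.
v1 = embed("She sat on the river bank.", "bank")
v2 = embed("He deposited cash at the bank.", "bank")
print(torch.cosine_similarity(v1, v2, dim=0).item())  # noticeably < 1.0
```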
Morphological Word-Embeddings [PDF]
Published at NAACL ...
Cotterell, Ryan, Schütze, Hinrich
openaire +2 more sources
Relational Word Embeddings [PDF]
While word embeddings have been shown to implicitly encode various forms of attributional knowledge, the extent to which they capture relational information is far more limited. In previous work, this limitation has been addressed by incorporating relational knowledge from external knowledge bases when learning word embeddings.
Camacho Collados, Jose +2 more
openaire +3 more sources
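The attributional-versus-relational distinction drawn above can be probed with the classic vector-offset test. A minimal sketch, assuming gensim and its downloadable GloVe vectors (standard embeddings, not the paper's relational ones):

```python
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")

# "man is to king as woman is to ?": the offset king - man + woman
# lands near "queen" when the relation is linearly encoded.
print(vectors.most_similar(positive=["king", "woman"],
                           negative=["man"], topn=1))
```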
Creating Welsh Language Word Embeddings [PDF]
Word embeddings are representations of words in a vector space that models semantic relationships between words by means of distance and direction. In this study, we adapted two existing methods, word2vec and fastText, to automatically learn Welsh word embeddings, taking into account the syntactic and morphological idiosyncrasies of this language.
Padraig Corcoran +4 more
openaire +3 more sources
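A minimal sketch of the two adapted methods, assuming gensim and a hypothetical one-sentence-per-line Welsh corpus file cy_corpus.txt; the study's actual corpora and hyperparameters may differ:

```python
from gensim.models import Word2Vec, FastText

with open("cy_corpus.txt", encoding="utf-8") as f:
    sentences = [line.split() for line in f]

# word2vec treats each word as an atomic unit.
w2v = Word2Vec(sentences, vector_size=100, window=5, min_count=5)

# fastText adds character n-grams, which suits the rich morphology
# and initial mutations of Welsh: unseen forms still get a vector.
ft = FastText(sentences, vector_size=100, window=5, min_count=5,
              min_n=3, max_n=6)

print(ft.wv.most_similar("Cymru", topn=5))
```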
Named Entity Recognition for Military Based on Domain Adaptive Embedding [PDF]
Applying deep neural network models to military named entity recognition suffers from low accuracy because the inadequate military corpus yields a poor-quality domain embedding space. To solve this problem, this paper introduces a domain adaptive method ...
LIU Kai, ZHANG Hong-jun, CHEN Fei-qiong
doaj +1 more source
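A minimal sketch of one common domain-adaptation recipe, continuing the training of general-domain vectors on a small in-domain corpus; gensim and the file names here are assumptions, and the paper's actual method may differ:

```python
from gensim.models import Word2Vec

model = Word2Vec.load("general_w2v.model")  # pretrained, general domain

with open("military_corpus.txt", encoding="utf-8") as f:
    domain_sents = [line.split() for line in f]

# Add domain terms to the vocabulary, then fine-tune in place.
model.build_vocab(domain_sents, update=True)
model.train(domain_sents, total_examples=len(domain_sents),
            epochs=model.epochs)
model.save("military_adapted.model")
```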
Sentence model based subword embeddings for a dialog system
This study focuses on improving a word embedding model to enhance the performance of downstream tasks, such as those of dialog systems. To improve traditional word embedding models, such as skip-gram, it is critical to refine the word features and expand ...
Euisok Chung +2 more
doaj +1 more source
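A minimal sketch of the skip-gram-with-subwords baseline this entry builds on, assuming gensim and a hypothetical dialog corpus file; the paper's sentence-model extension is not shown:

```python
from gensim.models import FastText

with open("dialog_corpus.txt", encoding="utf-8") as f:
    dialogs = [line.split() for line in f]

# sg=1 selects skip-gram; character n-grams let even a word never
# seen in training receive a vector composed from its subwords.
model = FastText(dialogs, sg=1, vector_size=100, window=5,
                 min_count=3, min_n=2, max_n=5)

print(model.wv["rebooting"])  # defined even if out of vocabulary
```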
All Word Embeddings from One Embedding [PDF]
NeurIPS ...
Sho Takase, Shunsuke Kobayashi
openalex +3 more sources
Joint Fine-Grained Components Continuously Enhance Chinese Word Embeddings
The most common approach to word embedding is to learn word vector representations from the contextual information of large-scale text. However, Chinese words usually consist of characters, subcharacters, and strokes, and each part contains rich semantic ...
Chengyang Zhuang +3 more
doaj +1 more source
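A minimal, self-contained sketch of the joint-granularity idea, combining word, character, and stroke vectors; the decomposition table and the averaging rule are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Toy lookup tables for each granularity.
word_vec = {"智能": rng.normal(size=dim)}
char_vec = {c: rng.normal(size=dim) for c in "智能"}
stroke_vec = {s: rng.normal(size=dim) for s in "丿一丨"}
strokes_of = {"智": ["丿", "一", "丨"], "能": ["丨", "一"]}  # toy data

def joint_embedding(word: str) -> np.ndarray:
    """Average the word vector with its character and stroke vectors."""
    parts = [word_vec[word]]
    for ch in word:
        parts.append(char_vec[ch])
        parts.extend(stroke_vec[s] for s in strokes_of[ch])
    return np.mean(parts, axis=0)

print(joint_embedding("智能").shape)  # (50,)
```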
Evaluating the Underlying Gender Bias in Contextualized Word Embeddings [PDF]
Gender bias strongly affects natural language processing applications. Word embeddings have clearly been shown both to preserve and to amplify gender biases present in current data sources.
Basta, Christine +2 more
core +2 more sources
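A minimal sketch of one simple bias probe, projecting profession words onto a he-she direction; it uses static GloVe vectors via gensim for brevity, whereas the paper itself evaluates contextualized embeddings:

```python
import gensim.downloader as api
import numpy as np

vectors = api.load("glove-wiki-gigaword-100")

direction = vectors["he"] - vectors["she"]
direction /= np.linalg.norm(direction)

for word in ["nurse", "engineer", "teacher", "surgeon"]:
    v = vectors[word] / np.linalg.norm(vectors[word])
    # Positive leans male, negative leans female.
    print(word, round(float(v @ direction), 3))
```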
Using Word Embedding to Evaluate the Coherence of Topics from Twitter Data [PDF]
Scholars often seek to understand topics discussed on Twitter using topic modelling approaches. Several coherence metrics have been proposed for evaluating the coherence of the topics generated by these approaches, including the pre-calculated ...
Fang, Anjie +3 more
core +1 more source
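A minimal sketch of an embedding-based coherence score, the mean pairwise cosine similarity of a topic's top words; gensim's Twitter-trained GloVe vectors stand in here, and the paper's exact metric may be defined differently:

```python
import gensim.downloader as api
import numpy as np

vectors = api.load("glove-twitter-50")

def coherence(top_words):
    """Mean pairwise cosine similarity of the topic's top words."""
    vs = [vectors[w] / np.linalg.norm(vectors[w])
          for w in top_words if w in vectors]
    sims = [vs[i] @ vs[j]
            for i in range(len(vs)) for j in range(i + 1, len(vs))]
    return float(np.mean(sims))

print(coherence(["game", "team", "win", "score", "player"]))   # tight topic
print(coherence(["game", "banana", "senate", "rain", "shoe"]))  # loose topic
```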

