Results 11 to 20 of about 16,624
Creating Welsh Language Word Embeddings [PDF]
Word embeddings are representations of words in a vector space that models semantic relationships between words by means of distance and direction. In this study, we adapted two existing methods, word2vec and fastText, to automatically learn Welsh word ...
Padraig Corcoran +4 more
doaj +3 more sources
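A minimal sketch of what such training looks like with gensim (4.x API); the toy Welsh corpus and hyperparameters here are illustrative assumptions, not the study's settings:

```python
# Minimal sketch: training word2vec and fastText embeddings with gensim (4.x API).
# The toy corpus and hyperparameters are illustrative, not the authors' pipeline.
from gensim.models import Word2Vec, FastText

corpus = [
    ["mae", "r", "gath", "yn", "cysgu"],   # "the cat is sleeping"
    ["mae", "r", "ci", "yn", "rhedeg"],    # "the dog is running"
    ["mae", "r", "gath", "yn", "rhedeg"],
]

# word2vec: each word type gets its own vector, so rare or unseen forms are a problem
w2v = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# fastText: vectors are built from character n-grams, which helps with
# morphologically rich languages such as Welsh
ft = FastText(corpus, vector_size=50, window=3, min_count=1, epochs=50)

print(w2v.wv.most_similar("gath", topn=2))
print(ft.wv.most_similar("gath", topn=2))  # fastText can also score out-of-vocabulary words
```

fastText's subword n-grams are one reason it suits Welsh in particular: mutated surface forms of the same lemma (e.g. cath/gath) share most of their character n-grams.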
Dynamic Contextualized Word Embeddings [PDF]
Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and ...
Valentin Hofmann +2 more
openaire +2 more sources
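A schematic sketch of the core idea (not the authors' architecture): a word's vector is computed as a function of both a linguistic context representation and an extralinguistic covariate such as time. All module names and sizes below are assumptions.

```python
# Schematic sketch (assumed names/sizes): a word vector is shifted by a function
# of its linguistic context encoding and an extralinguistic covariate (here, time).
import torch
import torch.nn as nn

class DynamicEmbedding(nn.Module):
    def __init__(self, vocab_size, dim, n_periods):
        super().__init__()
        self.static = nn.Embedding(vocab_size, dim)  # base word vectors
        self.time = nn.Embedding(n_periods, dim)     # extralinguistic context
        self.offset = nn.Linear(2 * dim, dim)        # context-dependent shift

    def forward(self, word_ids, period_ids, context_vec):
        base = self.static(word_ids)
        t = self.time(period_ids)
        # linguistic context (e.g. a sentence encoding) and time jointly shift the vector
        shift = self.offset(torch.cat([t, context_vec], dim=-1))
        return base + shift

model = DynamicEmbedding(vocab_size=1000, dim=64, n_periods=10)
vec = model(torch.tensor([5]), torch.tensor([3]), torch.randn(1, 64))
print(vec.shape)  # torch.Size([1, 64])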
Morphological Word-Embeddings [PDF]
Published at NAACL ...
Ryan Cotterell, Hinrich Schütze
openaire +2 more sources
Relational Word Embeddings [PDF]
While word embeddings have been shown to implicitly encode various forms of attributional knowledge, the extent to which they capture relational information is far more limited. In previous work, this limitation has been addressed by incorporating relational knowledge from external knowledge bases when learning word embeddings.
Jose Camacho-Collados +2 more
openaire +3 more sources
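A common way to probe how much relational information plain embeddings carry is the vector-offset (analogy) test; a small sketch using pretrained GloVe vectors from gensim's downloader (the model choice is illustrative, and the first call downloads the vectors):

```python
import gensim.downloader as api

# small pretrained GloVe vectors (downloads on first use)
wv = api.load("glove-wiki-gigaword-50")

# "man is to king as woman is to ?" via offset arithmetic on the vectors
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```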
Slovene and Croatian word embeddings in terms of gender occupational analogies
In recent years, the use of deep neural networks and dense vector embeddings for text representation has led to excellent results in the field of computational understanding of natural language.
Matej Ulčar +3 more
doaj +1 more source
Word Embeddings as Statistical Estimators [PDF]
Word embeddings are a fundamental tool in natural language processing. Currently, word embedding methods are evaluated on the basis of empirical performance on benchmark data sets, and there is a lack of rigorous understanding of their theoretical properties.
Dey N +3 more
europepmc +3 more sources
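For contrast, one classical embedding with a transparent statistical reading is a truncated SVD of a PPMI co-occurrence matrix; the sketch below is a generic illustration on toy data, not the paper's estimator:

```python
import numpy as np

docs = [["cats", "chase", "mice"], ["dogs", "chase", "cats"], ["mice", "fear", "cats"]]
vocab = sorted({w for d in docs for w in d})
idx = {w: i for i, w in enumerate(vocab)}

# symmetric co-occurrence counts within each (tiny) document
C = np.zeros((len(vocab), len(vocab)))
for d in docs:
    for i, w in enumerate(d):
        for v in d[:i] + d[i + 1:]:
            C[idx[w], idx[v]] += 1

# positive pointwise mutual information
total = C.sum()
pw = C.sum(axis=1, keepdims=True) / total
pmi = np.log((C / total + 1e-12) / (pw @ pw.T))
ppmi = np.maximum(pmi, 0)

# rank-2 embeddings from the SVD
U, S, _ = np.linalg.svd(ppmi)
emb = U[:, :2] * S[:2]
print(dict(zip(vocab, np.round(emb, 2))))
```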
Background: In the past few years, neural word embeddings have been widely used in text mining. However, the vector representations of word embeddings mostly act as a black box in downstream applications, thereby limiting their interpretability.
Zhiwei Chen +3 more
doaj +1 more source
Acoustic Word Embeddings for End-to-End Speech Synthesis
The most recent end-to-end speech synthesis systems use phonemes as acoustic input tokens and ignore the information about which word the phonemes come from.
Feiyu Shen, Chenpeng Du, Kai Yu
doaj +1 more source
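A hedged sketch of the general idea of injecting word identity into a phoneme-based synthesis encoder: repeat each word's embedding across its phonemes and concatenate it with the phoneme embeddings. All sizes, ids, and names below are illustrative assumptions, not the paper's model.

```python
import torch
import torch.nn as nn

phone_emb = nn.Embedding(60, 32)    # phoneme inventory -> 32-d vectors
word_emb = nn.Embedding(5000, 32)   # word vocabulary  -> 32-d vectors

phones = torch.tensor([12, 7, 33, 5, 18])      # phoneme ids of an utterance
word_of_phone = torch.tensor([0, 0, 1, 1, 1])  # which word each phoneme belongs to
words = torch.tensor([101, 2040])              # word ids of the utterance

# repeat each word vector across its phonemes, then fuse with the phoneme vectors
expanded = word_emb(words)[word_of_phone]                      # shape (5, 32)
encoder_in = torch.cat([phone_emb(phones), expanded], dim=-1)  # shape (5, 64)
print(encoder_in.shape)
```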
Contextual Word Embedding [PDF]
Effective clustering of short documents, such as tweets, is difficult because of the lack of sufficient semantic context. Word embedding is a technique that is effective in addressing this lack of semantic context. However, the process of word vector embedding, in turn, relies on the availability of sufficient contexts to learn the word associations ...
Debasis Ganguly, Kripabandhu Ghosh
openaire +1 more source
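A minimal sketch of embedding-based short-text clustering: represent each short document by the mean of its word vectors, then cluster the document vectors. The corpus, model, and cluster count below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.cluster import KMeans

tweets = [
    ["great", "match", "today"],
    ["amazing", "goal", "in", "the", "match"],
    ["new", "phone", "battery", "is", "great"],
    ["phone", "camera", "review"],
]

# train tiny word vectors on the corpus itself, then average per document
w2v = Word2Vec(tweets, vector_size=25, window=2, min_count=1, epochs=100, seed=0)
doc_vecs = np.array([np.mean([w2v.wv[w] for w in t], axis=0) for t in tweets])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(doc_vecs)
print(labels)  # ideally separates the sports tweets from the phone tweets
```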
All Word Embeddings from One Embedding [PDF]
NeurIPS ...
Sho Takase, Shunsuke Kobayashi
openalex +3 more sources

