Results 1 to 10 of about 89,062

Creating Welsh Language Word Embeddings [PDF]

open access: yesApplied Sciences, 2021
Word embeddings are representations of words in a vector space that models semantic relationships between words by means of distance and direction. In this study, we adapted two existing methods, word2vec and fastText, to automatically learn Welsh word ...
Padraig Corcoran   +4 more
doaj   +3 more sources
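
The snippet above describes embeddings as vectors whose distance and direction model semantic relationships. As a minimal illustrative sketch (the word vectors below are made-up three-dimensional toys; real word2vec or fastText embeddings typically have 100-300 dimensions), cosine similarity compares two vectors by direction:

```python
import math

# Hypothetical toy vectors for three Welsh words (values invented for
# illustration; not from the paper).
vectors = {
    "ci":   [0.9, 0.1, 0.3],   # "dog"
    "cath": [0.8, 0.2, 0.4],   # "cat"
    "trên": [0.1, 0.9, 0.2],   # "train"
}

def cosine_similarity(u, v):
    """Directional similarity: 1.0 = same direction, 0.0 = orthogonal."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# The two animal words point in similar directions, the unrelated word
# does not, which is the property the snippet refers to.
print(cosine_similarity(vectors["ci"], vectors["cath"]))
print(cosine_similarity(vectors["ci"], vectors["trên"]))
```
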

Word embeddings as autonomous predictors in materials design—the effect of inherent variability on information transfer [PDF]

open access: yesJournal of Cheminformatics
We propose that word embeddings of atoms derived from scientific literature be revisited as autonomous machine learning predictors in materials design.
Jana Radaković   +2 more
doaj   +2 more sources

Training and intrinsic evaluation of lightweight word embeddings for the clinical domain in Spanish [PDF]

open access: yesFrontiers in Artificial Intelligence, 2022
Resources for Natural Language Processing (NLP) are less numerous for languages different from English. In the clinical domain, where these resources are vital for obtaining new knowledge about human health and diseases, creating new resources for the ...
Carolina Chiu   +10 more
doaj   +2 more sources

Word Embeddings as Statistical Estimators. [PDF]

open access: yesSankhya Ser B
Word embeddings are a fundamental tool in natural language processing. Currently, word embedding methods are evaluated on the basis of empirical performance on benchmark data sets, and there is a lack of rigorous understanding of their theoretical properties.
Dey N   +3 more
europepmc   +3 more sources

Comparing general and specialized word embeddings for biomedical named entity recognition [PDF]

open access: yesPeerJ Computer Science, 2021
Increased interest in the use of word embeddings as word representations for biomedical named entity recognition (BioNER) has highlighted the need for evaluations that aid in selecting the best word embedding to use.
Rigo E. Ramos-Vargas   +2 more
doaj   +2 more sources

Dynamic Contextualized Word Embeddings [PDF]

open access: yesProceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021
Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and ...
Hofmann, V   +2 more
openaire   +2 more sources

Clustering and Visualising Documents using Word Embeddings

open access: yesThe Programming Historian, 2023
This lesson uses word embeddings and clustering algorithms in Python to identify groups of similar documents in a corpus of approximately 9,000 academic abstracts.
Jonathan Reades, Jennie Williams
doaj   +1 more source
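
This lesson pairs embeddings with clustering. A minimal k-means sketch of the idea (the "document embeddings" below are invented 2-D stand-ins; the lesson itself works on high-dimensional vectors for roughly 9,000 abstracts):

```python
# Six hypothetical document vectors forming two obvious groups.
docs = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25),   # group A
        (0.90, 0.80), (0.85, 0.90), (0.95, 0.85)]   # group B

def squared_dist(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def kmeans(points, k=2, iters=10):
    # Deterministic farthest-point initialisation (a k-means++-style choice).
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points,
            key=lambda p: min(squared_dist(p, c) for c in centroids)))
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        labels = [min(range(k), key=lambda c: squared_dist(p, centroids[c]))
                  for p in points]
        # Recompute each centroid as the mean of its assigned points.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = tuple(sum(xs) / len(members)
                                     for xs in zip(*members))
    return labels

labels = kmeans(docs)  # documents in the same group share a label
```
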

Morphological Word-Embeddings [PDF]

open access: yesProceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2015
Published at NAACL ...
Cotterell, Ryan, Schütze, Hinrich
openaire   +2 more sources

Relational Word Embeddings [PDF]

open access: yesProceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019
While word embeddings have been shown to implicitly encode various forms of attributional knowledge, the extent to which they capture relational information is far more limited. In previous work, this limitation has been addressed by incorporating relational knowledge from external knowledge bases when learning the word embedding.
Camacho Collados, Jose   +2 more
openaire   +3 more sources

Improving Word Embedding Using Variational Dropout

open access: yesProceedings of the International Florida Artificial Intelligence Research Society Conference, 2023
Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve the pre-trained word embeddings.
Zainab Albujasim   +3 more
doaj   +1 more source
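
The snippet above concerns post-processing pre-trained embeddings. The paper's method is variational dropout; as a simpler stand-in that shows what "post-processing" means here, this sketch applies mean-centering, a common first step in post-processing schemes such as all-but-the-top (toy vectors invented for illustration):

```python
# Hypothetical pre-trained embeddings (values invented for illustration).
embeddings = {
    "cat": [0.5, 0.3],
    "dog": [0.6, 0.4],
    "car": [0.1, 0.9],
}

def mean_center(emb):
    """Subtract the mean vector so the embedding set is zero-mean."""
    dim = len(next(iter(emb.values())))
    n = len(emb)
    mean = [sum(v[d] for v in emb.values()) / n for d in range(dim)]
    return {w: [x - m for x, m in zip(v, mean)] for w, v in emb.items()}

centered = mean_center(embeddings)
```

Mean-centering removes the common directional bias shared by all vectors, which post-processing papers report can sharpen similarity comparisons; it is named here only as an illustrative technique, not as the method of the paper above.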
