Results 1 to 10 of about 2,031,370
Dynamic Contextualized Word Embeddings [PDF]
Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and ...
Valentin Hofmann+2 more
arxiv +7 more sources
Activation of embedded words in spoken word recognition. [PDF]
Three cross-modal associative priming experiments investigated whether speech input activates words that are embedded in other words.
Jean Vroomen, Béatrice de Gelder
+7 more sources
Using word embeddings to investigate cultural biases. [PDF]
Word embeddings provide quantitative representations of word semantics and the associations between word meanings in text data, including in large repositories in media and social media archives.
Durrheim K+3 more
europepmc +2 more sources
Historical representations of social groups across 200 years of word embeddings from Google Books. [PDF]
Significance: How did societies of the past represent the various social groups of their world? Here, we address this question using word embeddings from 850 billion words of English-language books (from 1800 to 1999) to uncover principles of change and ...
Charlesworth TES, Caliskan A, Banaji MR.
europepmc +2 more sources
Morphological Word-Embeddings [PDF]
Published at NAACL ...
Ryan Cotterell, Hinrich Schütze
openalex +4 more sources
Dictionary-based Debiasing of Pre-trained Word Embeddings [PDF]
Word embeddings trained on large corpora have been shown to encode high levels of unfair discriminatory gender, racial, religious, and ethnic biases. In contrast, human-written dictionaries describe the meanings of words in a concise, objective, and unbiased manner.
Masahiro Kaneko, Danushka Bollegala
arxiv +3 more sources
Bias in word embeddings
Word embeddings are a widely used set of natural language processing techniques that map words to vectors of real numbers. These vectors are used to improve the quality of generative and predictive models.
Orestis Papakyriakopoulos+3 more
semanticscholar +2 more sources
Comparing general and specialized word embeddings for biomedical named entity recognition [PDF]
Increased interest in the use of word embeddings as word representations for biomedical named entity recognition (BioNER) has highlighted the need for evaluations that aid in selecting the best word embedding to use.
Rigo E. Ramos-Vargas+2 more
doaj +3 more sources
Refined Global Word Embeddings Based on Sentiment Concept for Sentiment Analysis
Sentiment analysis is an important research direction in natural language processing, widely used in politics, news, and other fields. Word embeddings play a significant role in sentiment analysis.
Yabing Wang+5 more
doaj +2 more sources
Training and intrinsic evaluation of lightweight word embeddings for the clinical domain in Spanish [PDF]
Resources for Natural Language Processing (NLP) are less numerous for languages other than English. In the clinical domain, where these resources are vital for obtaining new knowledge about human health and diseases, creating new resources for the ...
Carolina Chiu+10 more
doaj +2 more sources