Results 41 to 50 of about 96,400

Word Embeddings for Entity-annotated Texts

open access: yes, 2020
Learned vector representations of words are useful tools for many information retrieval and natural language processing tasks due to their ability to capture lexical semantics.
A Das   +15 more
core   +1 more source

Distilling Word Embeddings

open access: yes, Proceedings of the 25th ACM International Conference on Information and Knowledge Management, 2016
Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new research topic, as lightweight neural networks with high performance are particularly in need in various resource-restricted systems. This paper addresses the problem of distilling word embeddings for NLP tasks.
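One simple way to picture compressing embeddings into a smaller space is a truncated SVD of the teacher's embedding matrix; this is a minimal illustrative sketch, not the distillation method of the paper above, and the "teacher" matrix here is random stand-in data.

```python
import numpy as np

# Hypothetical teacher embeddings: 5 words in 8 dimensions.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(5, 8))

# Compress to 3 dimensions with a truncated SVD: the "student"
# embeddings are the best rank-3 approximation of the teacher space.
U, S, Vt = np.linalg.svd(teacher, full_matrices=False)
student = U[:, :3] * S[:3]          # shape (5, 3), one row per word

# Reconstruction from the student space keeps most of the structure.
reconstructed = student @ Vt[:3, :]
error = np.linalg.norm(teacher - reconstructed) / np.linalg.norm(teacher)
print(student.shape, round(float(error), 3))
```

The actual paper distills through a trained network rather than a linear factorization; the sketch only conveys the small-embeddings-from-large goal.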
Mou, Lili   +5 more
openaire   +2 more sources

Reliable Classification of FAQs with Spelling Errors Using an Encoder-Decoder Neural Network in Korean

open access: yes, Applied Sciences, 2019
To resolve lexical disagreement problems between queries and frequently asked questions (FAQs), we propose a reliable sentence classification model based on an encoder-decoder neural network.
Youngjin Jang, Harksoo Kim
doaj   +1 more source

SensEmbed: Learning sense embeddings for word and relational similarity [PDF]

open access: yes, 2015
Word embeddings have recently gained considerable popularity for modeling words in different Natural Language Processing (NLP) tasks including semantic similarity measurement.
IACOBACCI, IGNACIO JAVIER   +2 more
core   +2 more sources

Cultural Cartography with Word Embeddings [PDF]

open access: yes, Poetics, 2020
Using the frequency of keywords is a classic approach in the formal analysis of text, but has the drawback of glossing over the relationality of word meanings. Word embedding models overcome this problem by constructing a standardized and continuous “meaning-space” where words are assigned a location based on relations of similarity to other words ...
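The "meaning-space" idea in the snippet above can be made concrete with cosine similarity between word vectors; the tiny 3-dimensional vectors below are invented for illustration (real embeddings have hundreds of dimensions).

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: the standard closeness measure in embedding space."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy, hand-picked vectors standing in for learned embeddings.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.8, 0.9, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

sim_royal = cosine(vectors["king"], vectors["queen"])  # near 1: close in meaning-space
sim_fruit = cosine(vectors["king"], vectors["apple"])  # smaller: farther apart
print(round(sim_royal, 3), round(sim_fruit, 3))
```

Words are "assigned a location" simply as points in this vector space, and relational similarity falls out of the geometry.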
Stoltz, Dustin, Taylor, Marshall
openaire   +4 more sources

Effects of Semantic Features on Machine Learning-Based Drug Name Recognition Systems: Word Embeddings vs. Manually Constructed Dictionaries

open access: yes, Information, 2015
Semantic features are very important for machine learning-based drug name recognition (DNR) systems. The semantic features used in most DNR systems are based on drug dictionaries manually constructed by experts.
Shengyu Liu   +3 more
doaj   +1 more source

Better Word Embeddings by Disentangling Contextual n-Gram Information

open access: yes, 2019
Pre-trained word vectors are ubiquitous in Natural Language Processing applications. In this paper, we show how training word embeddings jointly with bigram and even trigram embeddings, results in improved unigram embeddings.
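A rough way to see how unigram and n-gram vectors can share one training run is to augment the token stream with joined bigram tokens before feeding it to a standard embedding trainer; this is a simplification for intuition, not the exact joint-training scheme of the paper above.

```python
def with_ngrams(tokens, n=2, sep="_"):
    """Return the unigram tokens plus joined n-gram tokens, so an
    off-the-shelf embedding trainer would learn unigram and n-gram
    vectors in the same space."""
    out = list(tokens)
    for i in range(len(tokens) - n + 1):
        out.append(sep.join(tokens[i:i + n]))
    return out

augmented = with_ngrams(["new", "york", "city"])
print(augmented)
# ['new', 'york', 'city', 'new_york', 'york_city']
```

Giving the model explicit bigram units like `new_york` lets the unigram vectors shed phrase-specific information, which is the disentangling effect the abstract refers to.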
Gupta, Prakhar   +2 more
core   +1 more source

Morphological Priors for Probabilistic Neural Word Embeddings

open access: yes, 2016
Word embeddings allow natural language processing systems to share statistical information across related words. These embeddings are typically based on distributional statistics, making it difficult for them to generalize to rare or unseen words.
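One well-known way to generalize to rare or unseen words is to compose a word vector from character n-gram vectors (the fastText subword idea); this sketch illustrates that general strategy, not the probabilistic morphological priors of the paper above, and the n-gram vectors are random stand-ins.

```python
import numpy as np

def char_ngrams(word, n=3):
    """Character n-grams with boundary markers, e.g. '<wa', 'wal', ..., 'ed>'."""
    padded = f"<{word}>"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

def embed(word, ngram_vectors, dim=4):
    """Sum the vectors of the word's known character n-grams, so an
    unseen word still gets a representation from shared subwords."""
    vec = np.zeros(dim)
    for g in char_ngrams(word):
        if g in ngram_vectors:
            vec += ngram_vectors[g]
    return vec

rng = np.random.default_rng(1)
seen = char_ngrams("walking") + char_ngrams("talked")
ngram_vectors = {g: rng.normal(size=4) for g in seen}

# "walked" was never seen, but shares n-grams ('wal', 'lke', 'ked', ...) with the vocabulary.
vec = embed("walked", ngram_vectors)
print(vec.round(2))
```

Because rare words share morphemes with frequent ones, their subword pieces carry statistical information even when the full form is absent from training data.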
Bhatia, Parminder   +2 more
core   +1 more source

Understanding and Creating Word Embeddings

open access: yes, The Programming Historian
Word embeddings allow you to analyze the usage of different terms in a corpus of texts by capturing information about their contextual usage. Through a primarily theoretical lens, this lesson will teach you how to prepare a corpus and train a word ...
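A minimal count-based sketch of "prepare a corpus and train a word embedding": build a co-occurrence matrix and factorize it with SVD. This is an illustrative alternative pipeline, not the lesson's own (which covers neural models such as word2vec), and the two-sentence corpus is invented.

```python
import numpy as np

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
]

# 1. Co-occurrence counts within a symmetric window of 1.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if i != j:
                counts[idx[w], idx[sent[j]]] += 1

# 2. Factorize; scaled left singular vectors serve as 2-d word embeddings.
U, S, Vt = np.linalg.svd(counts)
embeddings = U[:, :2] * S[:2]

print({w: embeddings[idx[w]].round(2).tolist() for w in ["cat", "dog"]})
```

Here "cat" and "dog" occur in identical contexts ("the _ sat"), so they receive identical vectors, which is exactly the contextual-usage signal the lesson describes.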
Avery Blankenship   +2 more
doaj   +1 more source
