Learning linear transformations between counting-based and prediction-based word embeddings. [PDF]
Despite the growing interest in prediction-based word embedding learning methods, it remains unclear how the vector spaces learnt by prediction-based methods differ from those of counting-based methods, or whether one can be transformed into ...
Danushka Bollegala +2 more
doaj +1 more source
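A note on the entry above: the paper's topic is learning a linear map between two embedding spaces. As a minimal sketch of that general idea (not the paper's specific method), a least-squares fit of a matrix W mapping one space onto the other can be computed with NumPy; the matrices below are random placeholders standing in for vectors of a shared vocabulary.

```python
import numpy as np

# Toy stand-ins: rows are vectors for the same vocabulary in two spaces,
# e.g. a counting-based space X and a prediction-based space Y (placeholders).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 300))   # counting-based vectors (placeholder)
Y = rng.normal(size=(1000, 100))   # prediction-based vectors (placeholder)

# Least-squares linear map W such that X @ W approximates Y.
W, residuals, rank, _ = np.linalg.lstsq(X, Y, rcond=None)

# Map a new counting-based vector into the prediction-based space.
x_new = rng.normal(size=(1, 300))
y_pred = x_new @ W
print(y_pred.shape)  # (1, 100)
```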
Background: In the past few years, neural word embeddings have been widely used in text mining. However, the vector representations of word embeddings mostly act as a black box in downstream applications using them, thereby limiting their interpretability.
Zhiwei Chen +3 more
doaj +1 more source
Multi-sense Embeddings Using Synonym Sets and Hypernym Information from Wordnet.
Word embedding approaches have increased the efficiency of natural language processing (NLP) tasks. Traditional word embeddings, though robust for many NLP activities, do not handle the polysemy of words.
Krishna Siva Prasad Mudigonda +1 more
doaj +1 more source
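A common way to get at multiple senses, as in the entry above, is to key representations to WordNet synsets rather than surface forms. The sketch below only illustrates retrieving synsets and hypernyms with NLTK and averaging pre-trained vectors over each synset's lemmas; it is an illustration under stated assumptions, not the paper's algorithm, and `pretrained_vectors` is a hypothetical word-to-vector lookup (e.g. a gensim `KeyedVectors`).

```python
import numpy as np
from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

def sense_vectors(word, pretrained_vectors):
    """One vector per WordNet synset of `word`, averaged over its lemma vectors."""
    senses = {}
    for syn in wn.synsets(word):
        lemmas = [l.name().replace('_', ' ') for l in syn.lemmas()]
        # Fold in hypernym lemmas as extra context for the sense.
        for hyper in syn.hypernyms():
            lemmas += [l.name().replace('_', ' ') for l in hyper.lemmas()]
        vecs = [pretrained_vectors[w] for w in lemmas if w in pretrained_vectors]
        if vecs:
            senses[syn.name()] = np.mean(vecs, axis=0)
    return senses
```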
Acoustic Word Embeddings for End-to-End Speech Synthesis
The most recent end-to-end speech synthesis systems use phonemes as acoustic input tokens and ignore the information about which word the phonemes come from.
Feiyu Shen, Chenpeng Du, Kai Yu
doaj +1 more source
Evaluating the Underlying Gender Bias in Contextualized Word Embeddings [PDF]
Gender bias strongly affects natural language processing applications. Word embeddings have been clearly shown both to retain and to amplify gender biases present in current data sources.
Basta, Christine +2 more
core +2 more sources
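The entry above studies bias in contextualized embeddings; a frequently used illustrative check (a simplification over static vectors, not the authors' evaluation protocol) compares target words against a he/she direction via cosine similarity. `kv` below is assumed to be a gensim `KeyedVectors` object loaded elsewhere.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_direction_score(kv, word):
    """Positive values lean toward 'he', negative toward 'she' (illustrative only)."""
    direction = kv['he'] - kv['she']   # crude one-pair gender direction
    return cosine(kv[word], direction)

# Example usage (assuming kv = gensim.models.KeyedVectors loaded from pre-trained vectors):
# for w in ['doctor', 'nurse', 'engineer']:
#     print(w, gender_direction_score(kv, w))
```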
Contextual Word Embedding [PDF]
Effective clustering of short documents, such as tweets, is difficult because of the lack of sufficient semantic context. Word embedding is a technique that is effective in addressing this lack of semantic context. However, the process of word vector embedding, in turn, relies on the availability of sufficient contexts to learn the word associations ...
Debasis Ganguly, Kripabandhu Ghosh
openaire +1 more source
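As a rough illustration of using word embeddings to compensate for the sparse context of short documents discussed above (not the authors' clustering method), tweets can be represented as the mean of their word vectors and then clustered; `kv` is again an assumed pre-trained `KeyedVectors`-style lookup.

```python
import numpy as np
from sklearn.cluster import KMeans

def embed_document(tokens, kv):
    """Mean of the in-vocabulary word vectors; zeros if none are found."""
    vecs = [kv[t] for t in tokens if t in kv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(kv.vector_size)

def cluster_tweets(tokenized_tweets, kv, n_clusters=10):
    X = np.vstack([embed_document(t, kv) for t in tokenized_tweets])
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
```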
This work explores the use of word embeddings as features for Spanish verb sense disambiguation (VSD). This type of learning technique is named disjoint semisupervised learning: an unsupervised algorithm (i.e.
Cristian Cardellino +1 more
doaj +1 more source
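In the disjoint semi-supervised setup described in the entry above, an unsupervised model (e.g. word2vec) is trained separately and its vectors are then used as features for a supervised classifier. A minimal sketch under those assumptions; the Spanish corpus, sense labels, and hyperparameters below are placeholders, not the paper's data or settings.

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

# 1) Unsupervised step: learn embeddings from an unlabelled corpus (placeholder data).
unlabelled_sentences = [["el", "banco", "abre", "temprano"],
                        ["se", "sentó", "en", "el", "banco"]]
w2v = Word2Vec(unlabelled_sentences, vector_size=50, min_count=1, epochs=20)

def featurize(tokens):
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

# 2) Supervised step: train a verb-sense classifier on labelled examples (placeholders).
X = np.vstack([featurize(s) for s in unlabelled_sentences])
y = [0, 1]  # placeholder sense labels
clf = LogisticRegression(max_iter=1000).fit(X, y)
```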
Using Word Embeddings in Twitter Election Classification [PDF]
Word embeddings and convolutional neural networks (CNN) have attracted extensive attention in various classification tasks for Twitter, e.g. sentiment classification.
Macdonald, Craig +2 more
core +2 more sources
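The entry above pairs word embeddings with a CNN for Twitter classification. A compact, generic sketch of such a text CNN in Keras follows; the vocabulary size, filter sizes, and other hyperparameters are hypothetical and not the authors' architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, embed_dim, n_classes = 20000, 100, 2  # hypothetical settings

model = tf.keras.Sequential([
    layers.Embedding(vocab_size, embed_dim),        # token ids -> word vectors
    layers.Conv1D(128, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```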
Attention Word Embedding [PDF]
Word embedding models learn semantically rich vector representations of words and are widely used to initialize natural language processing (NLP) models. The popular continuous bag-of-words (CBOW) model of word2vec learns a vector embedding by masking a given word in a sentence and then using the other words as a context to predict it.
Sonkar, Shashank +2 more
openaire +2 more sources
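The CBOW objective described in the snippet above (predict a masked word from its surrounding context) is what gensim's Word2Vec trains when `sg=0`. A minimal sketch on placeholder sentences:

```python
from gensim.models import Word2Vec

sentences = [["word", "embeddings", "capture", "semantics"],
             ["cbow", "predicts", "a", "word", "from", "its", "context"]]

# sg=0 selects CBOW (context words predict the target); sg=1 would select skip-gram.
model = Word2Vec(sentences, sg=0, vector_size=100, window=2, min_count=1, epochs=50)
print(model.wv.most_similar("word", topn=3))
```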
Enhancing Accuracy of Semantic Relatedness Measurement by Word Single-Meaning Embeddings
We propose a lightweight algorithm of learning word single-meaning embeddings (WSME), by exploring WordNet synsets and Doc2vec document embeddings, to enhance the accuracy of semantic relatedness measurement.
Xiaotao Li, Shujuan You, Wai Chen
doaj +1 more source
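The last entry combines WordNet synsets with Doc2vec for relatedness measurement. As a loose, assumption-laden sketch (not the WSME algorithm itself), one can train Doc2vec on synset glosses and measure relatedness as the similarity between the learned gloss vectors; training over all of WordNet is slow but runnable.

```python
from nltk.corpus import wordnet as wn                 # requires: nltk.download('wordnet')
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from gensim.utils import simple_preprocess

# One tagged document per synset gloss (definition text); covers all of WordNet.
docs = [TaggedDocument(simple_preprocess(s.definition()), [s.name()])
        for s in wn.all_synsets()]
d2v = Doc2Vec(docs, vector_size=100, min_count=2, epochs=20)

def relatedness(synset_a, synset_b):
    """Cosine similarity between the learned gloss vectors of two synsets."""
    return d2v.dv.similarity(synset_a, synset_b)

print(relatedness('dog.n.01', 'cat.n.01'))
```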

