Results 31 to 40 of about 256,478 (282)
Punctuation and Parallel Corpus Based Word Embedding Model for Low-Resource Languages
To overcome data sparseness in word embeddings trained on low-resource languages, we propose a punctuation- and parallel-corpus-based word embedding model.
Yang Yuan, Xiao Li, Ya-Ting Yang
doaj +1 more source
Approximating Word Ranking and Negative Sampling for Word Embedding [PDF]
CBOW (Continuous Bag-Of-Words) is one of the most commonly used techniques to generate word embeddings in various NLP tasks. However, it fails to reach optimal performance due to the uniform involvement of positive words and a simple sampling ...
Guo, Guibing +3 more
core +1 more source
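The CBOW objective mentioned in the snippet above (predict a center word from averaged context vectors, updating only a handful of sampled negative words instead of the full vocabulary) can be sketched in a few lines of NumPy. This is a toy illustration with an invented corpus and uniform negative sampling, not the paper's proposed method; real implementations draw negatives from a smoothed unigram distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (invented for illustration).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16

# Separate input (context) and output (target) embedding tables, as in word2vec.
W_in = rng.normal(scale=0.1, size=(V, D))
W_out = rng.normal(scale=0.1, size=(V, D))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(center, context, lr=0.05, k=3):
    """One CBOW step with negative sampling: score the true center word
    against k random negatives, so only k+1 output rows are updated."""
    h = W_in[context].mean(axis=0)          # averaged context vector
    negatives = rng.integers(0, V, size=k)  # uniform negatives (toy choice)
    targets = [(center, 1.0)] + [(n, 0.0) for n in negatives if n != center]
    grad_h = np.zeros(D)
    for word, label in targets:
        g = (sigmoid(W_out[word] @ h) - label) * lr
        grad_h += g * W_out[word]
        W_out[word] -= g * h
    for c in context:                       # spread the gradient over the context
        W_in[c] -= grad_h / len(context)

# Slide a radius-2 window over the corpus for a few passes.
for _ in range(200):
    for i, w in enumerate(corpus):
        ctx = [idx[corpus[j]]
               for j in range(max(0, i - 2), min(len(corpus), i + 3)) if j != i]
        train_step(idx[w], ctx)

def cos(a, b):
    u, v = W_in[idx[a]], W_in[idx[b]]
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# "mat" and "rug" appear in near-identical contexts, so they tend to drift together.
print(round(cos("mat", "rug"), 3))
```

The uniform negative sampler here is exactly the kind of "simple sampling" the paper critiques; swapping in a frequency-aware or rank-aware sampler only changes the `negatives` line.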
Biomedical Word Sense Disambiguation with Word Embeddings [PDF]
There is a growing need for automatic extraction of information and knowledge from the increasing amount of biomedical and clinical data produced, namely in textual form. Natural language processing comes in this direction, helping in tasks such as information extraction and information retrieval.
Antunes, Rui, Matos, Sérgio
openaire +2 more sources
Bayesian estimation‐based sentiment word embedding model for sentiment analysis
Sentiment word embedding has been extensively studied and used in sentiment analysis tasks. However, most existing models have failed to differentiate high‐frequency and low‐frequency words.
Jingyao Tang +7 more
doaj +1 more source
A Polarity Capturing Sphere for Word to Vector Representation
Embedding words from a dictionary as vectors in a space has become an active research field, due to its many uses in natural language processing applications.
Sandra Rizkallah +2 more
doaj +1 more source
Aspect Extraction of Case Microblog Based on Double Embedded Convolutional Neural Network [PDF]
Aspect extraction of the microblog involved in the case is a task in a specific domain. The expression of aspect words is diverse and the meaning is different from that of the general domain. Only relying on the word embedding in the general domain, these ...
WANG Xiao-han, TAN Chen-chen, XIANG Yan, YU Zheng-tao
doaj +1 more source
An Enhanced Neural Word Embedding Model for Transfer Learning
Due to the expansion of data generation, more and more natural language processing (NLP) tasks need to be solved. For this, word representation plays a vital role. Computation-based word embeddings in various high-resource languages are very useful. However,
Md. Kowsher +6 more
doaj +1 more source
Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new research topic, as lightweight neural networks with high performance are particularly in need in various resource-restricted systems. This paper addresses the problem of distilling word embeddings for NLP tasks.
Mou, Lili +5 more
openaire +2 more sources
Do Multi-Sense Embeddings Improve Natural Language Understanding? [PDF]
Learning a distinct representation for each sense of an ambiguous word could lead to more powerful and fine-grained models of vector-space representations. Yet while `multi-sense' methods have been proposed and tested on artificial word-similarity tasks,
Jurafsky, Dan, Li, Jiwei
core +1 more source
Cultural Cartography with Word Embeddings [PDF]
Using the frequency of keywords is a classic approach in the formal analysis of text, but has the drawback of glossing over the relationality of word meanings. Word embedding models overcome this problem by constructing a standardized and continuous “meaning-space” where words are assigned a location based on relations of similarity to other words ...
Stoltz, Dustin, Taylor, Marshall
openaire +4 more sources
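The "meaning-space" idea in the entry above, where a word's location is defined by its similarity to other words, reduces in practice to cosine similarity between vectors. A minimal sketch (the three-dimensional vectors below are invented for illustration; learned embeddings have hundreds of dimensions):

```python
import math

# A hand-made "meaning space": nearby directions stand in for related meanings.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.9, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(w1, w2):
    """Cosine of the angle between two word vectors; 1.0 = same direction."""
    u, v = embeddings[w1], embeddings[w2]
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine("king", "queen"))  # close to 1.0
print(cosine("king", "apple"))  # much lower
```

This relational measure is what lets embedding models go beyond raw keyword frequency: two words never co-occurring with a keyword can still be located near it in the space.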

