Results 11 to 20 of about 254,842 (278)
Social Media Topic Recognition Based on Word Embedding and Probabilistic Topic Model [PDF]
Word embedding can capture the semantic information of words from a large corpus, and combining it with a probabilistic topic model can address the lack of semantic information in the standard topic model. So in this paper, Word-Topic ...
YU Chong,LI Jing,SUN Xudong,FU Xianghua
doaj +1 more source
DAWE: A Double Attention-Based Word Embedding Model with Sememe Structure Information
Word embedding is an important foundation for natural language processing tasks: it generates distributed representations of words from large amounts of text data.
Shengwen Li +5 more
doaj +1 more source
TWE‐WSD: An effective topical word embedding based word sense disambiguation
Word embedding has been widely used in word sense disambiguation (WSD) and many other tasks in recent years because it represents word semantics well.
Lianyin Jia +5 more
doaj +1 more source
Creating Welsh Language Word Embeddings [PDF]
Word embeddings are representations of words in a vector space that models semantic relationships between words by means of distance and direction. In this study, we adapted two existing methods, word2vec and fastText, to automatically learn Welsh word embeddings taking into account syntactic and morphological idiosyncrasies of this language.
Padraig Corcoran +4 more
openaire +3 more sources
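The Welsh study above adapts word2vec and fastText to a morphologically rich language. A key reason fastText suits such languages is its subword model: each word is decomposed into character n-grams, so related word forms share vector components. The sketch below illustrates only that decomposition step in plain Python; the function name and parameters are illustrative, not the fastText API.

```python
# Minimal sketch of fastText-style subword extraction (pure Python).
# Each word is wrapped in boundary markers and split into character
# n-grams; the word's vector is the sum of its n-gram vectors, so
# morphologically related forms share components.

def char_ngrams(word, n_min=3, n_max=6):
    """Return the character n-grams a fastText-style model would use for `word`."""
    marked = f"<{word}>"  # boundary markers distinguish prefixes from suffixes
    grams = []
    for n in range(n_min, n_max + 1):
        for i in range(len(marked) - n + 1):
            grams.append(marked[i:i + n])
    grams.append(marked)  # the full word is also kept as its own feature
    return grams

# Welsh mutation example: "cath" (cat) and its soft-mutated form "gath"
# still share the n-grams covering "ath", keeping their vectors close.
shared = set(char_ngrams("cath")) & set(char_ngrams("gath"))
print(sorted(shared))
```

This is why subword models degrade more gracefully on out-of-vocabulary and mutated forms than plain word2vec, which treats each surface form as an atomic token.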
Name Entity Recognition for Military Based on Domain Adaptive Embedding [PDF]
To address the poor quality of the domain embedding space caused by an inadequate military corpus, which lowers the accuracy of deep neural network models applied to military named entity recognition, this paper introduces a domain adaptive method ...
LIU Kai, ZHANG Hong-jun, CHEN Fei-qiong
doaj +1 more source
Sentence model based subword embeddings for a dialog system
This study focuses on improving a word embedding model to enhance the performance of downstream tasks, such as those of dialog systems. To improve traditional word embedding models, such as skip-gram, it is critical to refine the word features and expand
Euisok Chung +2 more
doaj +1 more source
Joint Fine-Grained Components Continuously Enhance Chinese Word Embeddings
The most common method of word embedding is to learn word vector representations from context information of large-scale text. However, Chinese words usually consist of characters, subcharacters, and strokes, and each part contains rich semantic ...
Chengyang Zhuang +3 more
doaj +1 more source
Evaluating the Underlying Gender Bias in Contextualized Word Embeddings [PDF]
Gender bias strongly affects natural language processing applications. Word embeddings have been shown both to preserve and to amplify gender biases present in current data sources.
Basta, Christine +2 more
core +2 more sources
NEGATIVE-SAMPLING WORD-EMBEDDING METHOD
The method is best known from the work of Tomas Mikolov; his software and the theory behind it are the main subjects considered here. Note that the treatment is mathematically oriented.
Madina Bokan
doaj +1 more source
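The negative-sampling entry above refers to Mikolov's skip-gram with negative sampling (SGNS). The core update is simple: treat the observed (target, context) pair as a positive example and a few randomly drawn words as negatives, then take a logistic-regression gradient step on each. The numpy sketch below is a toy illustration under simplifying assumptions (uniform negative sampling, fixed learning rate); real word2vec uses a unigram^0.75 sampling table, frequent-word subsampling, and a decaying learning rate.

```python
# Toy sketch of skip-gram with negative sampling (SGNS) in numpy.
import numpy as np

rng = np.random.default_rng(0)
V, D = 10, 8                        # toy vocabulary size, embedding dim
W_in = rng.normal(0, 0.1, (V, D))   # "input" (target word) vectors
W_out = rng.normal(0, 0.1, (V, D))  # "output" (context word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(target, context, negatives, lr=0.1):
    """One SGD step: pull (target, context) together, push negatives apart."""
    v = W_in[target].copy()
    grad_v = np.zeros_like(v)
    # the positive pair gets label 1, each sampled negative gets label 0
    for c, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        g = lr * (label - sigmoid(v @ W_out[c]))  # log-sigmoid loss gradient
        grad_v += g * W_out[c]
        W_out[c] += g * v
    W_in[target] += grad_v

before = sigmoid(W_in[2] @ W_out[3])
for _ in range(100):
    # draw negatives uniformly for simplicity, skipping the true pair
    negs = [int(n) for n in rng.integers(0, V, size=5) if n not in (2, 3)]
    sgns_step(2, 3, negs)
after = sigmoid(W_in[2] @ W_out[3])
print(before, after)  # the positive pair's score rises with training
```

The key design point is that each step touches only the sampled rows of `W_out`, which is what makes negative sampling cheap compared to a full softmax over the vocabulary.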
Contextual Word Embedding [PDF]
Effective clustering of short documents, such as tweets, is difficult because of the lack of sufficient semantic context. Word embedding is a technique that is effective in addressing this lack of semantic context. However, the process of word vector embedding, in turn, relies on the availability of sufficient contexts to learn the word associations ...
Debasis Ganguly, Kripabandhu Ghosh
openaire +1 more source
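The entry above motivates word embeddings as a way to supply semantic context that short documents such as tweets lack. A common baseline pipeline, sketched below with tiny hand-made vectors standing in for pretrained embeddings (the vocabulary and values are illustrative only), is to average word vectors into a document vector and compare documents by cosine similarity before clustering.

```python
# Minimal sketch: embedding-averaged document vectors for short texts.
import numpy as np

# Toy "pretrained" embeddings: semantically related words get nearby vectors.
emb = {
    "cat":    np.array([1.0, 0.9, 0.0]),
    "dog":    np.array([0.9, 1.0, 0.0]),
    "stock":  np.array([0.0, 0.1, 1.0]),
    "market": np.array([0.1, 0.0, 0.9]),
}

def doc_vector(tokens):
    """Average the embeddings of known tokens into one document vector."""
    vecs = [emb[t] for t in tokens if t in emb]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

pets = doc_vector("cat dog".split())
finance = doc_vector("stock market".split())
cat = doc_vector(["cat"])
print(cosine(cat, pets), cosine(cat, finance))  # pets score is much higher
```

Even with no word overlap, two tweets about related topics land near each other in this space, which is exactly the extra semantic context a bag-of-words clusterer of short documents is missing.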