Results 1 to 10 of about 256,379
Sentiment-Aware Word Embedding for Emotion Classification
Word embeddings are effective intermediate representations for capturing semantic regularities between words in natural language processing (NLP) tasks. We propose sentiment-aware word embedding for emotion classification, which consists of integrating
Xingliang Mao +4 more
doaj +3 more sources
Impact of word embedding models on text analytics in deep learning environment: a review [PDF]
Deepak Asudani, Naresh Kumar Nagwani, Pradeep Singh
exaly +2 more sources
Dynamic Contextualized Word Embeddings [PDF]
Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and ...
Hofmann, V +2 more
openaire +2 more sources
Sentiment Classification Performance Analysis Based on Glove Word Embedding
The representation of words as mathematical expressions is an essential issue in natural language processing. In this study, data sets in different categories are classified as positive or negative according to their content.
Yasin Kırelli, Şebnem Özdemir
doaj +1 more source
Research of BERT Cross-Lingual Word Embedding Learning
With the development of multilingual information on the Internet, how to effectively represent the information contained in texts in different languages has become an important sub-task of natural language information processing.
WANG Yurong, LIN Min, LI Yanling
doaj +1 more source
Mirroring Vector Space Embedding for New Words
Most embedding models used in natural language processing require retraining of the entire model to obtain the embedding value of a new word. In the current system, as retraining is repeated, the amount of data used for learning gradually increases.
Jihye Kim, Ok-Ran Jeong
doaj +1 more source
Morphological Word-Embeddings [PDF]
Published at NAACL ...
Cotterell, Ryan, Schütze, Hinrich
openaire +2 more sources
Relational Word Embeddings [PDF]
While word embeddings have been shown to implicitly encode various forms of attributional knowledge, the extent to which they capture relational information is far more limited. In previous work, this limitation has been addressed by incorporating relational knowledge from external knowledge bases when learning the word embedding.
Camacho Collados, Jose +2 more
openaire +3 more sources
Comparing general and specialized word embeddings for biomedical named entity recognition [PDF]
Increased interest in the use of word embeddings, such as word representation, for biomedical named entity recognition (BioNER) has highlighted the need for evaluations that aid in selecting the best word embedding to be used.
Rigo E. Ramos-Vargas +2 more
doaj +2 more sources
Semantic Role Labeling for Amharic Text Using Multiple Embeddings and Deep Neural Network
Amharic is a morphologically complex and under-resourced language, posing difficulties for the development of natural language processing applications.
Bemnet Meresa Hailu +2 more
doaj +1 more source