Results 1 to 10 of about 254,842
Improved Arabic query expansion using word embedding [PDF]
Word embedding enhances pseudo-relevance feedback query expansion (PRFQE), but training word embedding models takes a long time and requires large datasets.
Yaser A. Al-Lahham +3 more
doaj +2 more sources
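As a rough illustration of the idea behind embedding-based query expansion (a sketch only, not the PRFQE pipeline evaluated in the paper; the model path is a placeholder):

```python
# Minimal sketch of embedding-based query expansion (illustrative only;
# not the paper's PRFQE pipeline). Assumes a pre-trained word2vec-format
# file at a hypothetical path.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format("arabic-vectors.bin", binary=True)  # hypothetical path

def expand_query(terms, per_term=3):
    """Add the nearest embedding-space neighbours of each query term."""
    expanded = list(terms)
    for term in terms:
        if term in vectors:
            expanded += [w for w, _ in vectors.most_similar(term, topn=per_term)]
    return expanded

print(expand_query(["bank", "loan"]))
```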
Dynamic Contextualized Word Embeddings [PDF]
Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and extralinguistic context.
Hofmann, V +2 more
openaire +2 more sources
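A toy sketch of that idea, with all shapes and the combination rule assumed for illustration rather than taken from the paper:

```python
# Toy sketch of conditioning a word vector on extralinguistic context
# (e.g., time, community). Shapes and the combination rule are
# assumptions for illustration, not the paper's architecture.
import numpy as np

rng = np.random.default_rng(0)
d = 8
contextual = rng.normal(size=d)          # stand-in for a contextualized word vector
extra = np.array([0.3, -1.2])            # e.g., normalized year, community indicator
W = rng.normal(size=(d, extra.size))     # learned projection (here: a random stand-in)

dynamic = contextual + W @ extra         # word vector now varies with extralinguistic context
print(dynamic.round(3))
```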
Sentiment Classification Performance Analysis Based on Glove Word Embedding
Representing words as mathematical expressions is an essential issue in natural language processing. In this study, datasets in different categories are classified as positive or negative according to their content.
Yasin Kırelli, Şebnem Özdemir
doaj +1 more source
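A minimal sketch of the common GloVe-plus-linear-classifier setup (the file path and toy labels are assumptions, not the study's data):

```python
# Minimal sketch of GloVe-based sentiment classification: average the
# GloVe vectors of a document's words and fit a linear classifier.
# The file path and toy labels are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

glove = {}
with open("glove.6B.50d.txt", encoding="utf-8") as f:  # hypothetical local copy
    for line in f:
        parts = line.split()
        glove[parts[0]] = np.asarray(parts[1:], dtype=np.float32)

def doc_vector(text):
    vecs = [glove[w] for w in text.lower().split() if w in glove]
    return np.mean(vecs, axis=0) if vecs else np.zeros(50, dtype=np.float32)

docs = ["great wonderful movie", "terrible boring film"]   # toy data
labels = [1, 0]                                            # 1 = positive
clf = LogisticRegression().fit([doc_vector(d) for d in docs], labels)
print(clf.predict([doc_vector("wonderful film")]))
```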
Research of BERT Cross-Lingual Word Embedding Learning
With the development of multilingual information on the Internet, how to effectively represent the information contained in different language texts has become an important sub-task of natural language information processing.
WANG Yurong, LIN Min, LI Yanling
doaj +1 more source
Morphological Word-Embeddings [PDF]
Published at NAACL ...
Cotterell, Ryan, Schütze, Hinrich
openaire +2 more sources
Relational Word Embeddings [PDF]
While word embeddings have been shown to implicitly encode various forms of attributional knowledge, the extent to which they capture relational information is far more limited. In previous work, this limitation has been addressed by incorporating relational knowledge from external knowledge bases when learning the word embedding.
Camacho Collados, Jose +2 more
openaire +3 more sources
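The classic way to probe relational information in word embeddings is the vector-offset (analogy) test; a sketch of that baseline follows, not the relational embeddings proposed in the paper (the model path is a placeholder):

```python
# Sketch of the vector-offset view of relational information in word
# embeddings (the classic analogy test), not the relational embeddings
# proposed in the paper. Assumes a pre-trained model at a hypothetical path.
from gensim.models import KeyedVectors

kv = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)  # hypothetical path
# capital-of relation as an offset: Paris - France + Japan ≈ Tokyo
print(kv.most_similar(positive=["Paris", "Japan"], negative=["France"], topn=1))
```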
Mirroring Vector Space Embedding for New Words
Most embedding models used in natural language processing require retraining the entire model to obtain an embedding for a new word. Under this approach, the amount of data used for learning grows with each repeated retraining.
Jihye Kim, Ok-Ran Jeong
doaj +1 more source
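A common baseline for this problem, shown only to illustrate it (this is not the mirroring method the paper proposes), is to average the vectors of a new word's context words:

```python
# Common baseline for assigning a vector to an unseen word without
# retraining: average the vectors of the words it co-occurs with.
# This illustrates the problem the paper targets; it is NOT the
# paper's mirroring method.
import numpy as np

def new_word_vector(context_words, vectors, dim=50):
    known = [vectors[w] for w in context_words if w in vectors]
    return np.mean(known, axis=0) if known else np.zeros(dim, dtype=np.float32)

vectors = {"deep": np.ones(4), "learning": np.full(4, 2.0)}  # toy vectors
print(new_word_vector(["deep", "learning", "unseenword"], vectors, dim=4))
```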
Comparing general and specialized word embeddings for biomedical named entity recognition [PDF]
Increased interest in the use of word embeddings as word representations for biomedical named entity recognition (BioNER) has highlighted the need for evaluations that help select the best word embedding to use.
Rigo E. Ramos-Vargas +2 more
doaj +2 more sources
Semantic Role Labeling for Amharic Text Using Multiple Embeddings and Deep Neural Network
Amharic is a morphologically complex and under-resourced language, posing difficulties for the development of natural language processing applications.
Bemnet Meresa Hailu +2 more
doaj +1 more source
A word embedding trained on South African news data
This article presents results from a study that developed and tested a word embedding trained on a dataset of South African news articles. A word embedding is an algorithm-generated word representation that can be used to analyse the corpus of words ...
Martin Canaan Mafunda +3 more
doaj +1 more source
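A minimal sketch of training such an embedding with gensim's Word2Vec, using toy sentences in place of the news dataset:

```python
# Minimal sketch of training a word embedding on a news corpus with
# gensim's Word2Vec. The sentences here are a toy stand-in for the
# South African news dataset, which is not distributed with this sketch.
from gensim.models import Word2Vec

sentences = [
    ["load", "shedding", "hits", "johannesburg"],
    ["eskom", "announces", "load", "shedding"],
]  # toy tokenized sentences
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1, epochs=50)
print(model.wv.most_similar("load", topn=2))
```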

