Results 1 to 10 of about 259,269 (327)

Data Sets: Word Embeddings Learned from Tweets and General Data

open access: diamond, 2017
A word embedding is a low-dimensional, dense and real-valued vector representation of a word. Word embeddings have been used in many NLP tasks. They are usually generated from a large text corpus.
Li, Quanzhi   +3 more
core   +3 more sources
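The definition in the snippet above (a low-dimensional, dense, real-valued vector per word, with related words getting similar vectors) can be illustrated with a minimal sketch. The vocabulary and vector values below are invented toy data, not taken from any of the listed papers; real embeddings are typically 100-300 dimensions and trained on a large corpus.

```python
import numpy as np

# Toy word embedding: each word maps to a dense, real-valued,
# low-dimensional vector (4 dimensions here, purely illustrative).
embedding = {
    "tweet": np.array([0.2, -0.1, 0.7, 0.4]),
    "post":  np.array([0.3, -0.2, 0.6, 0.5]),
    "car":   np.array([-0.8, 0.9, -0.1, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine similarity, the standard way to compare embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words ("tweet", "post") were given similar toy
# vectors, so their cosine similarity exceeds that of an unrelated pair.
sim_related = cosine_similarity(embedding["tweet"], embedding["post"])
sim_unrelated = cosine_similarity(embedding["tweet"], embedding["car"])
print(sim_related > sim_unrelated)
```

Cosine similarity over such vectors is what most of the downstream tasks in these listings (sentiment classification, NER, WSD, topic modeling) ultimately exploit.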

Sentiment Classification Performance Analysis Based on Glove Word Embedding

open access: yes, Sakarya Üniversitesi Fen Bilimleri Enstitüsü Dergisi, 2021
Representing words as mathematical expressions is an essential issue in natural language processing. In this study, data sets in different categories are classified as positive or negative according to their content.
Yasin Kırelli, Şebnem Özdemir
doaj   +1 more source

Research of BERT Cross-Lingual Word Embedding Learning

open access: yes, Jisuanji kexue yu tansuo, 2021
With the growth of multilingual information on the Internet, effectively representing the information contained in texts in different languages has become an important sub-task of natural language processing.
WANG Yurong, LIN Min, LI Yanling
doaj   +1 more source

Mirroring Vector Space Embedding for New Words

open access: yes, IEEE Access, 2021
Most embedding models used in natural language processing require retraining of the entire model to obtain the embedding value of a new word. In the current system, as retraining is repeated, the amount of data used for learning gradually increases.
Jihye Kim, Ok-Ran Jeong
doaj   +1 more source

Comparing general and specialized word embeddings for biomedical named entity recognition [PDF]

open access: yes, PeerJ Computer Science, 2021
Increased interest in the use of word embeddings as word representations for biomedical named entity recognition (BioNER) has highlighted the need for evaluations that help in selecting the best word embedding to use.
Rigo E. Ramos-Vargas   +2 more
doaj   +2 more sources

Semantic Role Labeling for Amharic Text Using Multiple Embeddings and Deep Neural Network

open access: yes, IEEE Access, 2023
Amharic is a morphologically complex and under-resourced language, posing difficulties for the development of natural language processing applications.
Bemnet Meresa Hailu   +2 more
doaj   +1 more source

A word embedding trained on South African news data

open access: yes, The African Journal of Information and Communication, 2022
This article presents results from a study that developed and tested a word embedding trained on a dataset of South African news articles. A word embedding is an algorithm-generated word representation that can be used to analyse the corpus of words ...
Martin Canaan Mafunda   +3 more
doaj   +1 more source

Social Media Topic Recognition Based on Word Embedding and Probabilistic Topic Model [PDF]

open access: yes, Jisuanji gongcheng, 2017
Word embedding can capture the semantic information of words from a large corpus, and its combination with the probabilistic topic model can solve the problem of lack of semantic information in the standard topic model. So in this paper, Word-Topic ...
YU Chong, LI Jing, SUN Xudong, FU Xianghua
doaj   +1 more source

DAWE: A Double Attention-Based Word Embedding Model with Sememe Structure Information

open access: yes, Applied Sciences, 2020
Word embedding is an important reference for natural language processing tasks, as it can generate distributed representations of words based on large amounts of text data.
Shengwen Li   +5 more
doaj   +1 more source

TWE‐WSD: An effective topical word embedding based word sense disambiguation

open access: yes, CAAI Transactions on Intelligence Technology, 2021
Word embedding has been widely used in word sense disambiguation (WSD) and many other tasks in recent years, as it can represent the semantics of words well.
Lianyin Jia   +5 more
doaj   +1 more source
