
Comparative Analysis of Using Word Embedding in Deep Learning for Text Classification

open access: yes, Jurnal Riset Informatika, 2023
Natural language processing (NLP) is a group of theory-driven computing techniques used to interpret and represent human discourse automatically.
Mukhamad Rizal Ilham, Arif Dwi Laksito
doaj   +1 more source

Data Sets: Word Embeddings Learned from Tweets and General Data

open access: yes, 2017
A word embedding is a low-dimensional, dense, real-valued vector representation of a word. Word embeddings have been used in many NLP tasks. They are usually generated from a large text corpus.
Li, Quanzhi   +3 more
core   +2 more sources
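The snippet above defines a word embedding as a dense, real-valued vector learned from a corpus. As a minimal toy illustration (a count-based SVD sketch on an invented two-sentence corpus, not the method of any paper listed here), embeddings can be obtained by factorizing a co-occurrence matrix:

```python
import numpy as np

# Toy corpus standing in for a "large text corpus".
corpus = ["the cat sat on the mat", "the dog sat on the rug"]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-1 word window.
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1

# Truncated SVD: keep the top-2 singular directions,
# giving one dense 2-dimensional vector per word.
U, S, _ = np.linalg.svd(C)
emb = U[:, :2] * S[:2]

print(emb.shape)  # one row per vocabulary word, 2 dimensions each
```

Real systems replace the toy window counts with neural objectives (e.g. skip-gram) and use far larger corpora and dimensionalities, but the output has the same shape: a dense vector per vocabulary item.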

Word Embedding With Zipf’s Context

open access: yes, IEEE Access, 2019
Word embeddings generated by neural language models have achieved great success in many NLP tasks. However, neural language models can be difficult and time-consuming to train.
Lizheng Gao   +3 more
doaj   +1 more source

Unsupervised Word Embedding Learning by Incorporating Local and Global Contexts

open access: yes, Frontiers in Big Data, 2020
Word embedding has benefited a broad spectrum of text analysis tasks by learning distributed word representations to encode word semantics. Word representations are typically learned by modeling local contexts of words, assuming that words sharing ...
Yu Meng   +5 more
doaj   +1 more source

Relevance-based Word Embedding

open access: yes, 2017
Learning a high-dimensional dense representation for vocabulary terms, also known as a word embedding, has recently attracted much attention in natural language processing and information retrieval tasks. The embedding vectors are typically learned based
Ai Qingyao   +15 more
core   +1 more source

Using Word Embeddings in Twitter Election Classification [PDF]

open access: yes, 2016
Word embeddings and convolutional neural networks (CNN) have attracted extensive attention in various classification tasks for Twitter, e.g. sentiment classification.
Macdonald, Craig   +2 more
core   +2 more sources

Citation Intent Classification Using Word Embedding

open access: yes, IEEE Access, 2021
Citation analysis is an active area of research for various reasons. So far, statistical approaches have mainly been used for citation analysis, but they do not look into the internal context of the citations.
Muhammad Roman   +4 more
doaj   +1 more source

Gloss Alignment using Word Embeddings

open access: yes, 2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW), 2023
Capturing and annotating sign language datasets is a time-consuming and costly process. Current datasets are orders of magnitude too small to successfully train unconstrained sign language translation (SLT) models. As a result, research has turned to TV broadcast content as a source of large-scale training data, consisting of both the sign language interpreter and the ...
Walsh, Harry   +3 more
openaire   +2 more sources

Improving Word Embedding Using Variational Dropout

open access: yes, Proceedings of the International Florida Artificial Intelligence Research Society Conference, 2023
Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve the pre-trained word embeddings.
Zainab Albujasim   +3 more
doaj   +1 more source

Evaluating Word Embeddings in Multi-label Classification Using Fine-grained Name Typing

open access: yes, 2018
Embedding models typically associate each word with a single real-valued vector, representing its different properties. Evaluation methods, therefore, need to analyze the accuracy and completeness of these properties in embeddings.
Kann, Katharina   +2 more
core   +1 more source
