
Comparative Analysis of Using Word Embedding in Deep Learning for Text Classification

open access: yesJurnal Riset Informatika, 2023
Natural language processing (NLP) is a group of theory-driven computing techniques used to automatically interpret and represent human discourse.
Mukhamad Rizal Ilham, Arif Dwi Laksito
doaj   +1 more source

Unsupervised Word Embedding Learning by Incorporating Local and Global Contexts

open access: yesFrontiers in Big Data, 2020
Word embedding has benefited a broad spectrum of text analysis tasks by learning distributed word representations to encode word semantics. Word representations are typically learned by modeling local contexts of words, assuming that words sharing ...
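The local-context idea this abstract describes can be illustrated with a minimal sketch (not the authors' method): count word co-occurrences inside a sliding window, apply a positive PMI transform, and factorize with SVD to obtain dense word vectors. The toy corpus, window size, and dimensionality below are all illustrative choices.

```python
import numpy as np

# Toy corpus; in practice this would be a large text collection.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# Count co-occurrences within a symmetric window of size 2 (the "local context").
window = 2
counts = np.zeros((V, V))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[idx[w], idx[sent[j]]] += 1

# Positive PMI transform, then truncated SVD for dense vectors.
total = counts.sum()
row = counts.sum(axis=1, keepdims=True)
col = counts.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts * total) / (row * col))
ppmi = np.maximum(pmi, 0)          # -inf (zero counts) becomes 0
ppmi[~np.isfinite(ppmi)] = 0       # guard against any remaining NaNs

U, S, _ = np.linalg.svd(ppmi)
dim = 4
embeddings = U[:, :dim] * S[:dim]  # each row is a word vector
```

Methods such as the one in this paper additionally incorporate global context; the sketch above covers only the window-based local side.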
Yu Meng   +5 more
doaj   +1 more source

Word Embedding With Zipf’s Context

open access: yesIEEE Access, 2019
Word embeddings generated by neural language models have achieved great success in many NLP tasks. However, neural language models can be difficult and time-consuming to train.
Lizheng Gao   +3 more
doaj   +1 more source

Comparative Analysis of Word Embeddings for Capturing Word Similarities

open access: yes, 2020
Distributed language representation has become the most widely used technique for language representation in various natural language processing tasks. Most of the natural language processing models that are based on deep learning techniques use already ...
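Word-similarity comparisons like those analyzed in this paper typically reduce to cosine similarity between word vectors. A minimal sketch, using made-up vectors for illustration (real evaluations would load pretrained embeddings such as GloVe or word2vec):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 3-dimensional vectors, invented for this example.
vectors = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

sim_royal = cosine_similarity(vectors["king"], vectors["queen"])
sim_fruit = cosine_similarity(vectors["king"], vectors["apple"])
# A good embedding places related words closer: sim_royal > sim_fruit.
```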
Kalajdjieski, Jovan   +2 more
core   +1 more source

Citation Intent Classification Using Word Embedding

open access: yesIEEE Access, 2021
Citation analysis is an active area of research for various reasons. So far, citation analysis has mainly relied on statistical approaches, which do not look into the internal context of citations.
Muhammad Roman   +4 more
doaj   +1 more source

Factors Influencing the Surprising Instability of Word Embeddings

open access: yes, 2018
Despite the recent popularity of word embedding methods, there is only a small body of work exploring the limitations of these representations. In this paper, we consider one aspect of embedding spaces, namely their stability.
Kummerfeld, Jonathan K.   +2 more
core   +1 more source

Gloss Alignment using Word Embeddings

open access: yes2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW), 2023
Capturing and annotating sign language datasets is a time-consuming and costly process. Current datasets are orders of magnitude too small to successfully train unconstrained sign language translation (SLT) models. As a result, research has turned to TV broadcast content as a source of large-scale training data, consisting of both the sign language interpreter and the ...
Walsh, Harry   +3 more
openaire   +2 more sources

Chinese Event Extraction Based on Attention and Semantic Features: A Bidirectional Circular Neural Network

open access: yesFuture Internet, 2018
Chinese event extraction uses word embedding to capture similarity, but suffers when handling previously unseen or rare words. Experiments show that characters may provide information that cannot be obtained from words alone, so we propose a novel ...
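The character-level intuition in this abstract (characters carrying information that word-level representations miss, especially for rare or unseen words) is commonly realized by extracting character n-grams, as in subword embedding models. A hedged sketch of that feature extraction step, not the paper's model:

```python
def char_ngrams(word, n=2):
    """Character n-grams with boundary markers, as used in subword models.

    Boundary markers '<' and '>' let the model distinguish prefixes and
    suffixes from word-internal substrings.
    """
    padded = f"<{word}>"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

# Even an unseen word decomposes into n-grams shared with known words,
# so it still receives a usable representation.
grams = char_ngrams("cat")
```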
Yue Wu, Junyi Zhang
doaj   +1 more source

A Smaller and Better Word Embedding for Neural Machine Translation

open access: yesIEEE Access, 2023
Word embeddings play an important role in Neural Machine Translation (NMT). However, they still suffer from several problems, such as ignoring prior knowledge of associations between words and passively relying on task-specific constraints in parameter ...
Qi Chen
doaj   +1 more source

Imparting Interpretability to Word Embeddings while Preserving Semantic Structure

open access: yes, 2020
As a ubiquitous method in natural language processing, word embeddings are extensively employed to map semantic properties of words into a dense vector representation.
Koç, Aykut   +4 more
core   +2 more sources
