
Contextual Word Embedding [PDF]

open access: yes, Companion of the The Web Conference 2018 on The Web Conference 2018 - WWW '18, 2018
Effective clustering of short documents, such as tweets, is difficult because they lack sufficient semantic context. Word embedding is an effective technique for addressing this lack of semantic context. However, the process of word vector embedding, in turn, relies on the availability of sufficient contexts to learn the word associations ...
Debasis Ganguly, Kripabandhu Ghosh
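A minimal sketch of the setup this abstract describes: represent each short text as the average of its word vectors, then cluster the averages. The toy "tweets", vocabulary, and random 50-dimensional vectors below are invented stand-ins for real data and pretrained embeddings.

```python
# Toy sketch: cluster short texts via averaged word vectors + k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=50)
         for w in "cats dogs pets cute rain storm cold weather".split()}

def embed(text):
    vecs = [vocab[w] for w in text.split() if w in vocab]
    return np.mean(vecs, axis=0)          # one vector per short document

tweets = ["cute cats", "dogs pets", "cold rain storm", "storm weather"]
X = np.vstack([embed(t) for t in tweets])
print(KMeans(n_clusters=2, n_init=10).fit_predict(X))
```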

NEGATIVE-SAMPLING WORD-EMBEDDING METHOD

open access: yes, Scientific Journal of Astana IT University, 2022
Tomas Mikolov is one of the best-known authors of the method, and his software and the theory behind its application are the main subjects considered here. Note that the treatment is more mathematically oriented.
Madina Bokan
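For orientation, here is a minimal numpy sketch of the skip-gram negative-sampling objective associated with Mikolov's work: one true (center, context) pair is pulled together while k randomly drawn "negative" words are pushed away. The vocabulary size, dimensions, and the uniform negative sampler are simplifications, not the paper's setup.

```python
# Minimal SGNS sketch: one stochastic update for a (center, context)
# pair plus k negatives. Uniform negative sampling is a simplification;
# word2vec actually samples from a smoothed unigram distribution.
import numpy as np

rng = np.random.default_rng(1)
V, d, k, lr = 100, 16, 5, 0.05              # vocab, dim, negatives, step
W_in = rng.normal(scale=0.1, size=(V, d))   # center-word vectors
W_out = rng.normal(scale=0.1, size=(V, d))  # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context):
    v, grad_v = W_in[center].copy(), np.zeros(d)
    samples = [(context, 1.0)] + [(n, 0.0) for n in rng.integers(0, V, k)]
    for idx, label in samples:               # label 1 = real pair, 0 = noise
        g = sigmoid(W_out[idx] @ v) - label  # gradient of logistic loss
        grad_v += g * W_out[idx]
        W_out[idx] -= lr * g * v
    W_in[center] -= lr * grad_v

sgns_step(center=3, context=7)
```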

Word Embedding Methods in Natural Language Processing: a Review [PDF]

open access: yes, Jisuanji kexue yu tansuo
Word embedding, as the first step in natural language processing (NLP) tasks, aims to transform input natural language text into numerical vectors, known as word vectors or distributed representations, which artificial intelligence models can process ...
ZENG Jun, WANG Ziwei, YU Yang, WEN Junhao, GAO Min
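The first step the review describes, mapping tokens to rows of a real-valued matrix, looks roughly like the sketch below; the tiny vocabulary and 4-dimensional vectors are invented for illustration.

```python
# Sketch: a word-embedding lookup table turns text into a sequence of
# numerical vectors. Unknown tokens fall back to an <unk> row.
import numpy as np

vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}
E = np.random.default_rng(0).normal(size=(len(vocab), 4))  # toy table

def to_vectors(text):
    ids = [vocab.get(tok, vocab["<unk>"]) for tok in text.lower().split()]
    return E[ids]                    # shape: (num_tokens, embedding_dim)

print(to_vectors("The cat sat").shape)   # (3, 4): one vector per token
```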

Task-Optimized Word Embeddings for Text Classification Representations

open access: yes, Frontiers in Applied Mathematics and Statistics, 2020
Word embeddings have introduced a compact and efficient way of representing text for further downstream natural language processing (NLP) tasks. Most word embedding algorithms are optimized at the word level.
Sukrat Gupta   +3 more
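One common way to obtain task-optimized rather than purely word-level vectors (a generic sketch, not the paper's specific method) is to let the embedding layer train jointly with the downstream classifier, so gradients from the task reshape the word vectors:

```python
# Sketch: a bag-of-words classifier whose embedding table is trained
# end-to-end with the classification objective. Sizes are illustrative.
import torch
import torch.nn as nn

class BowClassifier(nn.Module):
    def __init__(self, vocab_size=1000, dim=64, classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)  # trained with the task
        self.fc = nn.Linear(dim, classes)

    def forward(self, token_ids):                 # token_ids: (batch, seq)
        return self.fc(self.emb(token_ids).mean(dim=1))  # mean pooling

model = BowClassifier()
logits = model(torch.randint(0, 1000, (8, 12)))   # batch of 8 toy texts
loss = nn.functional.cross_entropy(logits, torch.randint(0, 2, (8,)))
loss.backward()            # gradients flow into the embedding table too
```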

Improve word embedding using both writing and pronunciation. [PDF]

open access: yes, PLoS ONE, 2018
Text representation can map text into a vector space for subsequent use in numerical calculations and processing tasks. Word embedding is an important component of text representation.
Wenhao Zhu   +4 more
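In the spirit of the paper (though not its exact model), combining writing-based and pronunciation-based information can be as simple as concatenating two vectors for the same word; both lookup tables below are made-up stand-ins.

```python
# Sketch: fuse a grapheme-derived vector with a phoneme-derived vector.
import numpy as np

rng = np.random.default_rng(2)
writing = {"knight": rng.normal(size=8)}        # from characters
pronunciation = {"knight": rng.normal(size=8)}  # from phonemes, /naɪt/

def combined(word):
    # concatenation keeps both sources of information; 16-dim result
    return np.concatenate([writing[word], pronunciation[word]])

print(combined("knight").shape)   # (16,)
```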

Morpheme Embedding for Bahasa Indonesia Using Modified Byte Pair Encoding

open access: yes, IEEE Access, 2021
Word embedding is an efficient feature representation that carries semantic and syntactic information. However, it operates at the word level, treating each word as a small independent unit, and it cannot handle words that are not in the training corpus ...
Amalia Amalia   +3 more
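For context, a bare-bones version of standard byte-pair encoding, which repeatedly merges the most frequent adjacent symbol pair, is sketched below on the Indonesian root "makan"; the paper's modification, which steers merges toward morpheme boundaries, is not reproduced here.

```python
# Plain BPE on a toy corpus: count adjacent symbol pairs, merge the
# most frequent one, repeat. Word frequencies are invented.
from collections import Counter

words = {("m","a","k","a","n"): 4, ("m","a","k","a","n","a","n"): 2}

def most_frequent_pair(vocab):
    pairs = Counter()
    for word, freq in vocab.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge(vocab, pair):
    merged = {}
    for word, freq in vocab.items():
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                out.append(word[i] + word[i + 1]); i += 2
            else:
                out.append(word[i]); i += 1
        merged[tuple(out)] = freq
    return merged

for _ in range(3):                        # learn three merge operations
    words = merge(words, most_frequent_pair(words))
print(words)                              # subword units like "an" emerge
```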

Attention Word Embedding [PDF]

open access: yes, Proceedings of the 28th International Conference on Computational Linguistics, 2020
Word embedding models learn semantically rich vector representations of words and are widely used to initialize natural language processing (NLP) models. The popular continuous bag-of-words (CBOW) model of word2vec learns a vector embedding by masking a given word in a sentence and then using the other words as a context to predict it.
Sonkar, Shashank   +2 more
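The CBOW step the abstract describes, masking a word and predicting it from the unweighted average of its context vectors, can be sketched as below; the attention-weighted variant the paper proposes would replace the plain mean. All sizes and indices are illustrative.

```python
# CBOW prediction sketch: average the context vectors, score every
# vocabulary word against the average, take a softmax.
import numpy as np

rng = np.random.default_rng(3)
V, d = 50, 8
W_in, W_out = rng.normal(size=(V, d)), rng.normal(size=(V, d))

def cbow_predict(context_ids):
    h = W_in[context_ids].mean(axis=0)   # plain, unweighted average
    scores = W_out @ h                   # one score per vocabulary word
    probs = np.exp(scores - scores.max())
    return probs / probs.sum()           # softmax over the vocabulary

p = cbow_predict([4, 9, 17, 21])         # context around the masked word
print(p.argmax())                        # the model's guess for the gap
```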

Approximating Word Ranking and Negative Sampling for Word Embedding [PDF]

open access: yes, 2018
CBOW (Continuous Bag-Of-Words) is one of the most commonly used techniques to generate word embeddings in various NLP tasks. However, it fails to reach optimal performance because it involves all positive words uniformly and uses a simple sampling ...
Guo, Guibing   +3 more
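The "simple sampling" the abstract contrasts itself with is, in standard word2vec, a smoothed unigram distribution rather than a uniform one; a sketch of that baseline sampler (with invented counts) is:

```python
# word2vec's standard negative-sampling distribution: unigram counts
# raised to the 3/4 power, then normalized.
import numpy as np

counts = np.array([1000, 200, 50, 5], dtype=float)  # per-word frequencies
p = counts ** 0.75
p /= p.sum()                                        # sampling distribution

rng = np.random.default_rng(4)
negatives = rng.choice(len(counts), size=5, p=p)    # draw 5 negatives
print(p.round(3), negatives)
```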

Experiential, Distributional and Dependency-based Word Embeddings have Complementary Roles in Decoding Brain Activity [PDF]

open access: yes, 2017
We evaluate 8 different word embedding models on their usefulness for predicting the neural activation patterns associated with concrete nouns. The models we consider include an experiential model, based on crowd-sourced association data, several popular ...
Abnar, Samira   +3 more
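A typical evaluation pipeline for this line of work (a sketch with synthetic data, not the authors' exact protocol) fits a linear map from word vectors to activation patterns on training nouns and tests it on held-out nouns:

```python
# Ridge regression from embeddings to synthetic "voxel" activations.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(5)
X = rng.normal(size=(60, 300))    # 60 nouns, 300-dim word embeddings
Y = X @ rng.normal(size=(300, 500)) + rng.normal(scale=0.1, size=(60, 500))

model = Ridge(alpha=1.0).fit(X[:50], Y[:50])   # train on 50 nouns
pred = model.predict(X[50:])                   # predict 10 held-out nouns
print(np.corrcoef(pred.ravel(), Y[50:].ravel())[0, 1])
```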

Skip-Gram-KR: Korean Word Embedding for Semantic Clustering

open access: yesIEEE Access, 2019
Deep learning algorithms are used in various applications for pattern recognition, natural language processing, speech recognition, and so on. Recently, neural network-based natural language processing techniques have used fixed-length word embeddings.
Sun-Young Ihm, Ji-Hye Lee, Young-Ho Park
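Standard skip-gram training pairs, each word paired with its neighbors inside a window, are generated as below; Skip-Gram-KR's Korean-specific adjustments to this step are not shown.

```python
# Generate (center, context) pairs for skip-gram training.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["나는", "학교에", "간다"], window=1))
```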
