Results 251 to 260 of about 256,478 (282)
Some of the following articles may not be open access.
Topical Word Embeddings
Proceedings of the AAAI Conference on Artificial Intelligence, 2015
Most word embedding models typically represent each word using a single vector, which makes these models indiscriminative for ubiquitous homonymy and polysemy. In order to enhance discriminativeness, we employ latent topic models to assign topics for each word in the text corpus, and learn topical word embeddings (TWE) based on both ...
Yang Liu +3 more
openaire +1 more source
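The snippet above describes the core TWE move: pair each word occurrence with a latent topic so that distinct senses of the same word get distinct embeddings. A minimal sketch of that preprocessing step (the toy corpus and hand-assigned topics below are invented for illustration; in TWE the assignments come from an LDA-style topic model, and the resulting pairs feed a word2vec-style trainer rather than a counter):

```python
from collections import Counter

# Toy corpus with per-token topic assignments (hand-assigned here;
# TWE would obtain these from a latent topic model such as LDA).
tokens = ["apple", "stock", "rises", "apple", "pie", "recipe"]
topics = [0, 0, 0, 1, 1, 1]  # topic 0: finance, topic 1: cooking

# Treat each (word, topic) pair as a distinct pseudo-word, so the two
# senses of "apple" receive separate embeddings downstream.
pseudo_tokens = [f"{w}#{t}" for w, t in zip(tokens, topics)]

# Collect skip-gram (center, context) pairs with window size 1; a real
# model would train embeddings on these pairs instead of counting them.
pairs = Counter()
for i, center in enumerate(pseudo_tokens):
    for j in (i - 1, i + 1):
        if 0 <= j < len(pseudo_tokens):
            pairs[(center, pseudo_tokens[j])] += 1

print(sorted(set(pseudo_tokens)))
```

Note that "apple#0" and "apple#1" are now distinct vocabulary items, which is exactly what makes the model discriminative for homonymy and polysemy.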
This chapter deals with the mathematical representation of words through vectors or embeddings which are the basis of modern language models. It starts by discussing the limits of the one-hot representation and continues with a section that presents traditional approaches based on the factorization of the word co-occurrence ...
Christophe Gaillac, Jérémy L'Hour
openaire +2 more sources
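The chapter snippet above contrasts one-hot vectors with embeddings obtained by factorizing a word co-occurrence matrix. A minimal sketch of that classical pipeline (the vocabulary, context words, and counts are invented for illustration; a real matrix would be gathered from a corpus):

```python
import numpy as np

# Rows are words, columns are context words; counts are made up.
vocab = ["king", "queen", "apple", "orange"]
contexts = ["crown", "throne", "fruit", "juice"]
C = np.array([
    [5, 4, 0, 1],   # king
    [4, 5, 1, 0],   # queen
    [0, 1, 5, 4],   # apple
    [1, 0, 4, 5],   # orange
], dtype=float)

# One-hot vectors: every word is orthogonal to every other, so the
# representation carries no similarity information at all.
one_hot = np.eye(len(vocab))

# Truncated SVD keeps the top-k singular directions of the
# co-occurrence matrix, giving each word a dense k-dim embedding.
U, s, _ = np.linalg.svd(C, full_matrices=False)
k = 2
emb = U[:, :k] * s[:k]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(emb[0], emb[1]))  # king vs queen: close to 1
print(cosine(emb[0], emb[2]))  # king vs apple: much lower
```

In the dense space, words with similar co-occurrence rows end up close together, which one-hot vectors can never express.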
Sentiment Analysis with Word Embedding
2018 IEEE 7th International Conference on Adaptive Science & Technology (ICAST), 2018
The basic task of sentiment analysis is to determine the sentiment polarity (positivity, neutrality or negativity) of a piece of text. The deficiencies of traditional bag-of-words models affect the accuracy of sentiment classification. The purpose of this study is to improve the accuracy of sentiment classification by employing the concept of word ...
B. Oscar Deho +3 more
openaire +2 more sources
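The abstract above proposes replacing bag-of-words features with word-embedding features for sentiment classification. A common minimal version of that idea is to average the embeddings of a document's tokens into one dense feature vector; a sketch under toy assumptions (the 2-d vectors below are invented, standing in for pretrained word2vec/GloVe vectors with hundreds of dimensions):

```python
import numpy as np

# Hypothetical pretrained embeddings (toy 2-d vectors for illustration).
emb = {
    "great":    np.array([0.9, 0.1]),
    "terrible": np.array([-0.8, 0.2]),
    "movie":    np.array([0.0, 0.5]),
}

def doc_vector(tokens):
    """Average the embeddings of known tokens into one document vector.

    Unlike a bag-of-words count vector, this stays dense and
    low-dimensional no matter how large the vocabulary is.
    """
    vecs = [emb[t] for t in tokens if t in emb]
    return np.mean(vecs, axis=0) if vecs else np.zeros(2)

pos = doc_vector(["great", "movie"])
neg = doc_vector(["terrible", "movie"])

# A linear classifier over such vectors separates polarity; in this toy
# setup the first embedding dimension alone distinguishes the documents.
print(pos[0] > 0, neg[0] < 0)
```

The averaged vector would then be fed to any standard classifier (logistic regression, SVM, or a neural network) in place of sparse counts.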
Musical Word Embedding for Music Tagging and Retrieval
IEEE Transactions on Audio, Speech and Language Processing (submitted), 2022
SeungHeon Doh, Jongpil Lee, Dasaem Jeong, Juhan Nam
DEMO: https://seungheondoh.github.io/musical_word_embedding_demo/
Word embedding has become an essential means for text-based information retrieval.
openaire +1 more source
Word Embeddings are Word Story Embeddings (and That's Fine)
2022
Katrin Erk, Gabriella Chronis
openaire +1 more source
Adaptive cross-contextual word embedding for word polysemy with unsupervised topic modeling
Knowledge-Based Systems, 2021
Shuangyin Li
exaly
A Comparative Analysis of Word Embedding and Deep Learning for Arabic Sentiment Classification
Electronics (Switzerland), 2023
Sahar F. Sabbeh, Heba Fasihuddin
exaly
Task-specific dependency-based word embedding methods
Pattern Recognition Letters, 2022
Chengwei Wei, Bin Wang, C.-C. Jay Kuo
exaly

