Results 291 to 300 of about 259,269 (327)
A wood‐form‐stable phase change composite featuring a tensile strength of 134.42 MPa, zero leakage under load, and a phase change enthalpy of 94.73 J g−1 is developed through a structural biomimicry and dual‐network reinforcement strategy. With its exceptional shape stability and remarkable phase change performance, this phase change composite is well ...
Ya Zhou +5 more
wiley +1 more source
Some of the following articles may not be open access.
2021 Conference on Information Communications Technology and Society (ICTAS), 2021
Word embeddings are currently the most popular vector space model in Natural Language Processing. How we encode words is important because it affects the performance of many downstream tasks such as Machine Translation (MT), Information Retrieval (IR) and Automatic Speech Recognition (ASR).
Sibonelo Dlamini +3 more
openaire +1 more source
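The snippet above notes that how words are encoded matters because downstream tasks compare those encodings. A minimal sketch of the vector-space idea, using hand-picked toy vectors (real models learn hundreds of dimensions from large corpora):

```python
import math

# Hypothetical 3-d toy embeddings, hand-picked for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity: the usual closeness measure in vector space models."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Semantically related words get similar vectors, so they score higher.
print(cosine(embeddings["king"], embeddings["queen"]) >
      cosine(embeddings["king"], embeddings["apple"]))  # True
```

Tasks such as MT, IR, and ASR inherit whatever similarity structure the embedding encodes, which is why the choice of encoding affects their performance.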
Proceedings of the AAAI Conference on Artificial Intelligence, 2015
Most word embedding models typically represent each word using a single vector, which makes these models indiscriminative for ubiquitous homonymy and polysemy. In order to enhance discriminativeness, we employ latent topic models to assign topics for each word in the text corpus, and learn topical word embeddings (TWE) based on both ...
Yang Liu +3 more
openaire +1 more source
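The TWE preprocessing step described above can be sketched as pairing each token with a topic so that homonyms get separate vocabulary entries. The topic assignments below are hard-coded stand-ins for what a latent topic model such as LDA would infer:

```python
# Sketch of topical word embedding (TWE) preprocessing: each token is paired
# with a topic id, so a polysemous word like "bank" yields distinct entries.
# Topic labels here are hypothetical; TWE derives them with a topic model.
corpus = [
    ("bank", "finance"), ("loan", "finance"),
    ("bank", "geography"), ("river", "geography"),
]

# The word#topic vocabulary an embedding model would then be trained on.
vocab = sorted({f"{w}#{t}" for w, t in corpus})
print(vocab)  # ['bank#finance', 'bank#geography', 'loan#finance', 'river#geography']
```

Because "bank#finance" and "bank#geography" now receive separate vectors, the single-vector indiscriminativeness the abstract criticizes is avoided.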
Abstract This chapter deals with the mathematical representation of words through vectors or embeddings which are the basis of modern language models. It starts by discussing the limits of the one-hot representation and continues with a section that presents traditional approaches based on the factorization of the word co-occurrence ...
Christophe Gaillac, Jérémy L'Hour
openaire +2 more sources
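The one-hot limitation the chapter opens with, and the co-occurrence counts that factorization approaches start from, can both be shown in a few lines (the tiny corpus is invented for illustration):

```python
from collections import Counter

# One-hot encoding: each word is a vector with a single 1.
vocab = ["cat", "dog", "car"]
one_hot = {w: [1 if i == j else 0 for j in range(len(vocab))]
           for i, w in enumerate(vocab)}

# The limit discussed in the chapter: distinct words are always orthogonal,
# so one-hot vectors carry no notion of similarity.
dot = sum(a * b for a, b in zip(one_hot["cat"], one_hot["dog"]))
print(dot)  # 0

# Co-occurrence counts, the raw material that factorization methods
# (e.g. SVD of this matrix) turn into dense, similarity-aware vectors.
corpus = ["the cat sat", "the dog sat", "the car parked"]
cooc = Counter()
for sent in corpus:
    words = sent.split()
    for i, w in enumerate(words):
        for j, c in enumerate(words):
            if i != j:
                cooc[(w, c)] += 1
# "cat" and "dog" share context words ("the", "sat"), which is what lets
# factorization assign them similar dense vectors.
```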
Sentiment Analysis with Word Embedding
2018 IEEE 7th International Conference on Adaptive Science & Technology (ICAST), 2018
The basic task of sentiment analysis is to determine the sentiment polarity (positive, neutral, or negative) of a piece of text. Deficiencies of the traditional bag-of-words model affect the accuracy of sentiment classification. The purpose of this study is to improve the accuracy of sentiment classification by employing the concept of word ...
B. Oscar Deho +3 more
openaire +2 more sources
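A common way word embeddings enter sentiment classification is to average a sentence's word vectors into a single feature vector. A toy sketch, with hand-picked 1-d "sentiment axis" vectors and a sign check standing in for a trained classifier:

```python
# Hypothetical 1-d toy vectors; real systems use learned, high-dimensional
# embeddings and a trained classifier rather than a sign threshold.
embeddings = {
    "great": [0.9], "love": [0.8], "terrible": [-0.9], "movie": [0.0],
}

def sentence_vector(sentence):
    """Average the word vectors (unknown words map to a zero vector)."""
    vecs = [embeddings.get(w, [0.0]) for w in sentence.lower().split()]
    return [sum(v[0] for v in vecs) / len(vecs)]

def polarity(sentence):
    score = sentence_vector(sentence)[0]
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("great movie"))     # positive
print(polarity("terrible movie"))  # negative
```

Unlike bag-of-words counts, the averaged embedding keeps similarity between related words (e.g. "great" and "love"), which is the gain the abstract is after.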
2022
Musical Word Embedding for Music Tagging and Retrieval
IEEE Transactions on Audio, Speech and Language Processing (submitted) - SeungHeon Doh, Jongpil Lee, Dasaem Jeong, Juhan Nam
DEMO: https://seungheondoh.github.io/musical_word_embedding_demo/
Word embedding has become an essential means for text-based information retrieval.
openaire +1 more source
Word Embeddings are Word Story Embeddings (and That's Fine)
2022
Katrin Erk, Gabriella Chronis
openaire +1 more source

