Results 121 to 130 of about 2,031,469

Learning Word Sense Embeddings from Word Sense Definitions [PDF]

open access: yes · arXiv, 2016
Word embeddings play a significant role in many modern NLP systems. Since a single representation per word is problematic for polysemous and homonymous words, researchers have proposed using one embedding per word sense. These approaches mainly train word sense embeddings on a corpus.
arxiv  
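
A minimal sketch of the one-embedding-per-sense idea from this abstract: a sense vector can be initialized as the average of the word vectors appearing in that sense's dictionary definition. The vectors and definitions below are toy assumptions for illustration, not the paper's data or model:

    # Minimal sketch: one embedding per word sense, derived by averaging the
    # word vectors of that sense's definition (toy data, hypothetical names).
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy pre-trained word vectors (a real system would load word2vec/GloVe).
    word_vecs = {w: rng.normal(size=50) for w in
                 "financial institution money land beside water river".split()}

    def sense_embedding(definition_words):
        """Average the available word vectors of a sense's definition."""
        vecs = [word_vecs[w] for w in definition_words if w in word_vecs]
        return np.mean(vecs, axis=0)

    # Two senses of "bank", each represented by its own vector.
    bank_finance = sense_embedding(["financial", "institution", "money"])
    bank_river = sense_embedding(["land", "beside", "water", "river"])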

Blind signal decomposition of various word embeddings based on joint and individual variance explained [PDF]

open access: yes · arXiv, 2020
In recent years, natural language processing (NLP) has become one of the most important research areas, with applications throughout everyday life. As its most fundamental task, word embedding still requires more attention and research. Current work on word embeddings focuses on proposing novel embedding algorithms and dimension ...
arxiv  
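
The "joint and individual variance" idea can be sketched with plain linear algebra: stack two embedding matrices over a shared vocabulary, estimate a shared subspace, and split each matrix into a joint projection plus an individual residual. This SVD-based approximation and all sizes below are assumptions for illustration, not the paper's exact decomposition:

    # Hedged sketch of a joint/individual split for two embedding matrices
    # (same vocabulary rows, possibly different dimensions). A plain SVD
    # approximation of the JIVE idea, not the paper's exact algorithm.
    import numpy as np

    rng = np.random.default_rng(1)
    X1 = rng.normal(size=(1000, 100))   # e.g. word2vec embeddings (toy data)
    X2 = rng.normal(size=(1000, 300))   # e.g. GloVe embeddings (toy data)

    # Joint subspace: top-r left singular vectors of the row-wise concatenation.
    r = 10
    U, _, _ = np.linalg.svd(np.hstack([X1, X2]), full_matrices=False)
    J = U[:, :r]                         # shared row-space basis

    # Joint part = projection onto the shared basis; individual part = residual.
    joint_1, indiv_1 = J @ (J.T @ X1), X1 - J @ (J.T @ X1)
    joint_2, indiv_2 = J @ (J.T @ X2), X2 - J @ (J.T @ X2)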

Engineering the Future of Restorative Clinical Peripheral Nerve Surgery

open access: yes · Advanced Healthcare Materials, EarlyView.
What if damaged nerves could regenerate more effectively? This review unveils cutting‐edge strategies to restore nerve function, from biomaterial scaffolds and bioactive molecules to living engineered tissues. By accelerating axonal regrowth, preserving Schwann cells, and enhancing connectivity, these approaches are reshaping nerve repair—offering new ...
Justin C. Burrell   +5 more
wiley   +1 more source

Neural-based Noise Filtering from Word Embeddings [PDF]

open access: yes · arXiv, 2016
Word embeddings have been shown to benefit NLP tasks substantially. Yet there is room for improvement in the vector representations, because current word embeddings typically contain unnecessary information, i.e., noise. We propose two novel models that improve word embeddings through unsupervised learning, in order to yield word denoising embeddings ...
arxiv  
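
The paper trains neural filters; as a hedged stand-in for the denoising idea, here is a simple low-rank baseline that keeps only the dominant principal directions of a (toy) embedding matrix and discards low-variance directions as noise. This is a deliberately different, simpler technique than the paper's models:

    # Stand-in for embedding denoising (the paper uses neural filters; this
    # is a low-rank baseline): keep top-k principal directions, drop the rest.
    import numpy as np

    def denoise(E, k=50):
        """Project centered embeddings E (V x d) onto their top-k components."""
        Ec = E - E.mean(axis=0)
        U, S, Vt = np.linalg.svd(Ec, full_matrices=False)
        return Ec @ Vt[:k].T @ Vt[:k]    # low-rank reconstruction

    E = np.random.default_rng(2).normal(size=(5000, 300))  # toy embeddings
    E_clean = denoise(E, k=100)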

Enhancing Ultrasound Power Transfer: Efficiency, Acoustics, and Future Directions

open access: yes · Advanced Materials, EarlyView.
Implantable devices significantly enhance healthcare but are limited by battery life. Ultrasound power transfer technology offers a promising solution for sustainable operation. This review addresses gaps in current research, particularly in sound field analysis and energy efficiency optimization.
Yi Zheng   +6 more
wiley   +1 more source

Task-Optimized Word Embeddings for Text Classification Representations

open access: yes · Frontiers in Applied Mathematics and Statistics, 2020
Word embeddings have introduced a compact and efficient way of representing text for further downstream natural language processing (NLP) tasks. Most word embedding algorithms are optimized at the word level.
Sukrat Gupta   +3 more
doaj   +1 more source
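
A minimal sketch of task-level optimization under one common reading: the embedding table is trained jointly with a downstream classifier, so gradients from the classification loss reshape the word vectors. The PyTorch model, vocabulary size, and batch below are illustrative assumptions, not the authors' architecture:

    # Hedged sketch: task-optimized embeddings via an nn.EmbeddingBag layer
    # fine-tuned jointly with a text classifier (toy vocabulary and labels).
    import torch
    import torch.nn as nn

    vocab_size, dim, n_classes = 1000, 64, 2

    class Classifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.emb = nn.EmbeddingBag(vocab_size, dim)  # mean-pools tokens
            self.out = nn.Linear(dim, n_classes)
        def forward(self, tokens, offsets):
            return self.out(self.emb(tokens, offsets))

    model = Classifier()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # One toy batch: two documents packed flat, with start offsets and labels.
    tokens = torch.randint(0, vocab_size, (12,))
    offsets = torch.tensor([0, 7])
    labels = torch.tensor([0, 1])

    loss = loss_fn(model(tokens, offsets), labels)
    loss.backward()                 # gradients flow into the embedding table,
    opt.step()                      # so the vectors adapt to the task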

Exploration on Grounded Word Embedding: Matching Words and Images with Image-Enhanced Skip-Gram Model [PDF]

open access: yes · arXiv, 2018
Word embeddings are designed to represent the semantic meaning of a word with low-dimensional vectors. The state-of-the-art methods for learning word embeddings (word2vec and GloVe) use only word co-occurrence information. The learned embeddings are real-valued vectors, which are opaque to humans.
arxiv  
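
One way to read "image-enhanced skip-gram" is a co-occurrence objective plus a grounding term that ties a word's vector to a projected image feature when an image exists. The loss below is a toy numpy sketch under that assumption; the projection P, the sampled word ids, and the data are hypothetical, not the paper's model:

    # Toy sketch: skip-gram negative-sampling loss plus an assumed visual
    # grounding penalty for words that have an associated image feature.
    import numpy as np

    rng = np.random.default_rng(3)
    V, d, d_img = 100, 50, 512
    W = rng.normal(scale=0.1, size=(V, d))        # target word vectors
    C = rng.normal(scale=0.1, size=(V, d))        # context word vectors
    P = rng.normal(scale=0.1, size=(d_img, d))    # image-to-embedding map
    img_feat = {7: rng.normal(size=d_img)}        # word id 7 has an image

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def loss(target, context, negatives, lam=0.1):
        sg = -np.log(sigmoid(W[target] @ C[context]))             # positive
        sg -= np.log(sigmoid(-W[target] @ C[negatives].T)).sum()  # negatives
        vis = 0.0
        if target in img_feat:                    # assumed grounding term
            vis = lam * np.sum((W[target] - img_feat[target] @ P) ** 2)
        return sg + vis

    print(loss(7, 12, np.array([3, 45, 9])))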

Optoelectronic Devices for In‐Sensor Computing

open access: yes · Advanced Materials, EarlyView.
Raw data obtained directly from sensors in the noisy analogue domain is often unstructured: it lacks a predefined format or organization and does not conform to a specific data model. Optoelectronic devices for in‐sensor visual processing can integrate perception, memory, and processing functions in the same physical units, which can compress ...
Qinqi Ren   +7 more
wiley   +1 more source

Identity-sensitive Word Embedding through Heterogeneous Networks [PDF]

open access: yes · arXiv, 2016
Most existing word embedding approaches do not distinguish the same word in different contexts, thereby ignoring its contextual meanings. As a result, the learned embeddings of such words are usually a mixture of multiple meanings. In this paper, we acknowledge multiple identities of the same word in different contexts and learn the ...
arxiv  
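
The paper learns identities through heterogeneous networks; as an illustrative stand-in for "multiple vectors per word", the classic multi-prototype recipe clusters the context vectors of a word's occurrences and keeps one embedding per cluster. The data below is synthetic and the technique is a deliberate simplification, not the paper's method:

    # Illustrative stand-in (not the heterogeneous-network method): give one
    # word multiple "identity" vectors by clustering its occurrence contexts.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    # Each row: the mean of pre-trained context-word vectors around one
    # occurrence of the ambiguous word (toy data standing in for a corpus).
    occurrence_contexts = rng.normal(size=(200, 50))

    k = 3                                   # assumed number of identities
    km = KMeans(n_clusters=k, n_init=10).fit(occurrence_contexts)
    identity_vectors = km.cluster_centers_  # one embedding per identity

    # At use time, pick the identity whose center is nearest to the context.
    new_context = rng.normal(size=50)
    identity = km.predict(new_context[None, :])[0]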

Towards a Theoretical Understanding of Word and Relation Representation [PDF]

open access: yes · arXiv, 2022
Representing words by vectors, or embeddings, enables computational reasoning and is foundational to automating natural language tasks. For example, if word embeddings of similar words contain similar values, word similarity can be readily assessed, whereas judging that from their spelling is often impossible (e.g.
arxiv  
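
The snippet's point that similarity is readable from vectors but not from spelling reduces to a one-line computation, commonly cosine similarity. A minimal illustration with toy vectors (not real embeddings):

    # Minimal illustration: word similarity as cosine similarity of vectors.
    import numpy as np

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    rng = np.random.default_rng(5)
    car = rng.normal(size=50)
    automobile = car + 0.1 * rng.normal(size=50)   # nearby vector: similar
    banana = rng.normal(size=50)                   # unrelated vector

    print(cosine(car, automobile))   # close to 1.0
    print(cosine(car, banana))       # near 0.0 in high dimensions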
