Results 31 to 40 of about 109,902
Blind signal decomposition of various word embeddings based on join and individual variance explained [PDF]
In recent years, natural language processing (NLP) has become one of the most important research areas, with wide-ranging applications in everyday life. Word embedding, as one of its most fundamental tasks, still requires further attention and research. Existing work on word embedding focuses on proposing novel embedding algorithms and dimension ...
arxiv
Enhancing Ultrasound Power Transfer: Efficiency, Acoustics, and Future Directions
Implantable devices significantly enhance healthcare but are limited by battery life. Ultrasound power transfer technology offers a promising solution for sustainable operation. This review addresses gaps in current research, particularly in sound field analysis and energy efficiency optimization.
Yi Zheng+6 more
wiley +1 more source
Neural-based Noise Filtering from Word Embeddings [PDF]
Word embeddings have been shown to benefit NLP tasks substantially. Yet there is room for improvement in the vector representations, because current word embeddings typically contain unnecessary information, i.e., noise. We propose two novel models that improve word embeddings via unsupervised learning, yielding word-denoising embeddings ...
arxiv
Optoelectronic Devices for In‐Sensor Computing
Raw data obtained directly from sensors in the noisy analogue domain is often unstructured: it lacks a predefined format or organization and does not conform to a specific data model. Optoelectronic devices for in-sensor visual processing can integrate perception, memory, and processing functions in the same physical units, which can compress ...
Qinqi Ren+7 more
wiley +1 more source
Exploration on Grounded Word Embedding: Matching Words and Images with Image-Enhanced Skip-Gram Model [PDF]
Word embedding is designed to represent the semantic meaning of a word with low-dimensional vectors. The state-of-the-art methods for learning word embeddings (word2vec and GloVe) use only word co-occurrence information. The learned embeddings are real-valued vectors, which are opaque to humans.
arxiv
In this review, recent developments in deep-blue (≤465 nm) perovskite light-emitting diodes (PeLEDs) are summarized, covering different perovskite nanomaterials, including nanocrystals (NCs), quantum dots (QDs), nanoplatelets (NPLs), quasi-2D thin films, 3D bulk thin films, as well as lead-free perovskite nanomaterials.
Pui Kei Ko+6 more
wiley +1 more source
Identity-sensitive Word Embedding through Heterogeneous Networks [PDF]
Most existing word embedding approaches do not distinguish the same word in different contexts, thereby ignoring its contextual meanings. As a result, the learned embeddings of these words are usually a mixture of multiple meanings. In this paper, we acknowledge multiple identities of the same word in different contexts and learn the ...
arxiv
Towards a Theoretical Understanding of Word and Relation Representation [PDF]
Representing words by vectors, or embeddings, enables computational reasoning and is foundational to automating natural language tasks. For example, if word embeddings of similar words contain similar values, word similarity can be readily assessed, whereas judging that from their spelling is often impossible (e.g.
arxiv
Pushing Radiative Cooling Technology to Real Applications
Radiative cooling controls surface optical properties for solar and thermal radiation, offering solutions for global warming and energy savings. Despite significant advances, key challenges remain: optimizing optical efficiency, maintaining aesthetics, preventing overcooling, enhancing durability, and enabling scalable production.
Chongjia Lin+8 more
wiley +1 more source
Evaluating the Underlying Gender Bias in Contextualized Word Embeddings [PDF]
Gender bias strongly affects natural language processing applications. Word embeddings have been shown both to preserve and to amplify gender biases present in current data sources. Recently, contextualized word embeddings have improved on earlier word embedding techniques by computing word vector representations dependent on the sentence ...
arxiv