Results 61 to 70 of about 254,842 (278)
Embedding Semantic Relations into Word Representations [PDF]
Learning representations for semantic relations is important for various tasks such as analogy detection, relational search, and relation classification.
Bollegala, Danushka +2 more
core +1 more source
Factors Influencing the Surprising Instability of Word Embeddings
Despite the recent popularity of word embedding methods, there is only a small body of work exploring the limitations of these representations. In this paper, we consider one aspect of embedding spaces, namely their stability.
Kummerfeld, Jonathan K. +2 more
core +1 more source
Freeze‐drying of layered silicate is the key to get coatings with superior gas barrier
Freeze‐drying of layered silicates modified with dodecylamine (DDA) is a highly effective technique for the preparation of barrier pigments that significantly mitigate the permeation of oxygen, water vapor, and hydrogen through polymer films containing these ...
Joshua Lommes +4 more
wiley +1 more source
What Do Large Language Models Know About Materials?
If large language models (LLMs) are to be used inside the material discovery and engineering process, they must be benchmarked for the accuracy of their intrinsic material knowledge. The current work introduces 1) a reasoning process through the processing–structure–property–performance chain and 2) a tool for benchmarking the knowledge of LLMs concerning ...
Adrian Ehrenhofer +2 more
wiley +1 more source
Toward the Development of Large-Scale Word Embedding for Low-Resourced Language
Natural language processing relies on word embedding as a key procedure for semantically and syntactically manipulating unlabeled text corpora.
Shahzad Nazir +5 more
doaj +1 more source
Compressing Word Embeddings [PDF]
10 pages, 0 figures, submitted to ICONIP-2016. Previous experimental results were submitted to ICLR-2016, but the paper has been significantly updated, since a new experimental set-up worked much ...
openaire +2 more sources
Relevance-based Word Embedding
Learning a high-dimensional dense representation for vocabulary terms, also known as a word embedding, has recently attracted much attention in natural language processing and information retrieval tasks. The embedding vectors are typically learned based ...
Ai Qingyao +15 more
core +1 more source
Microstructure Evolution of a VMnFeCoNi High‐Entropy Alloy After Synthesis, Swaging, and Annealing
The synthesis and processing (rotary swaging and annealing) of the novel VMnFeCoNi alloy is investigated, alongside the estimation of the grain size effect on hardness. Analysis of a wide grain size range of recrystallized microstructures (12–210 µm) reveals a low annealing twin density.
Aditya Srinivasan Tirunilai +6 more
wiley +1 more source
Word embedding is a technique for converting words into vectors, known as word vectors. Although word embedding offers several powerful approaches, the existing methods can still be improved.
Andzar Tsaqif Laksana +2 more
doaj +1 more source
Rotations and Interpretability of Word Embeddings: the Case of the Russian Language
Consider a continuous word embedding model. Usually, the cosines between word vectors are used as a measure of similarity of words. These cosines do not change under orthogonal transformations of the embedding space.
Zobnin, Alexey
core +1 more source
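The invariance noted in this entry's abstract is easy to verify directly: cosine similarity depends only on angles and norms, both of which orthogonal transformations preserve. A minimal sketch in plain Python, using a 2D rotation (one example of an orthogonal transformation) on two arbitrary illustrative vectors:

```python
import math

def cos_sim(a, b):
    """Cosine similarity between two 2D vectors."""
    dot = a[0] * b[0] + a[1] * b[1]
    return dot / (math.hypot(*a) * math.hypot(*b))

def rotate(v, theta):
    """Rotate a 2D vector by angle theta (an orthogonal map)."""
    x, y = v
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

u, v = (1.0, 2.0), (3.0, 1.0)          # arbitrary "word vectors"
theta = 0.7                             # arbitrary rotation angle

before = cos_sim(u, v)
after = cos_sim(rotate(u, theta), rotate(v, theta))
print(abs(before - after) < 1e-12)      # cosine is unchanged by the rotation
```

This is why, as the abstract points out, rotating an embedding space (e.g. to improve interpretability) leaves all cosine-based similarity judgments intact.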