Results 61 to 70 of about 89,062
Exploring the Privacy-Preserving Properties of Word Embeddings: Algorithmic Validation Study
Background: Word embeddings are dense numeric vectors used to represent language in neural networks. Until recently, there had been no publicly released embeddings trained on clinical data.
Abdalla, Mohamed +3 more
doaj +1 more source
WOVe: Incorporating Word Order in GloVe Word Embeddings
Word vector representations open up new opportunities to extract useful information from unstructured text. Defining a word as a vector makes it easy for machine learning algorithms to understand a text and extract information from it. Word vector representations have been used in many applications such as word synonyms, word analogy, syntactic parsing ...
Mohammed Salah Ibrahim +3 more
openaire +2 more sources
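The word-analogy task mentioned above exploits the offset structure of embedding spaces: the vector difference between related words (e.g. "king" − "man") tends to encode a relation that transfers to other pairs. A minimal sketch with toy, hand-picked 3-dimensional vectors (illustrative values, not trained embeddings such as GloVe or WOVe):

```python
import numpy as np

# Toy 3-dimensional embeddings (illustrative values, not trained vectors).
emb = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.8, 0.1, 0.6]),
    "man":   np.array([0.3, 0.9, 0.1]),
    "woman": np.array([0.3, 0.1, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Solve a : b :: c : ?  via the vector offset  b - a + c."""
    target = emb[b] - emb[a] + emb[c]
    # Exclude the query words themselves from the candidate answers.
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))  # -> queen
```

With real pretrained embeddings the same offset arithmetic is applied over a vocabulary of hundreds of thousands of words; the toy dictionary here only demonstrates the mechanics.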
Freeze‐drying of layered silicate is key to obtaining coatings with a superior gas barrier. Freeze‐drying of layered silicates modified with dodecylamine (DDA) is a highly effective technique for the preparation of barrier pigments that significantly mitigate the permeation of oxygen, water vapor, and hydrogen through polymer films containing these ...
Joshua Lommes +4 more
wiley +1 more source
Twitter is a social media site where people post their personal experiences, opinions, and news. Due to the ubiquitous availability of real-time data, many rescue agencies monitor this data regularly to identify disasters, reduce risk, and save lives ...
Sumona Deb, Ashis Kumar Chanda
doaj +1 more source
What Do Large Language Models Know About Materials?
If large language models (LLMs) are to be used inside the material discovery and engineering process, they must be benchmarked for the accuracy of their intrinsic material knowledge. The current work introduces 1) a reasoning process through the processing–structure–property–performance chain and 2) a tool for benchmarking the knowledge of LLMs concerning ...
Adrian Ehrenhofer +2 more
wiley +1 more source
An Experimental Analysis of Deep Neural Network Based Classifiers for Sentiment Analysis Task
The application of natural language processing (NLP) to the sentiment analysis task using textual data has seen wide-scale adoption across various domains in a plethora of industries.
Mrigank Shukla, Akhil Kumar
doaj +1 more source
Semantic Structure and Interpretability of Word Embeddings
Dense word embeddings, which encode the semantic meanings of words into low-dimensional vector spaces, have become very popular in natural language processing (NLP) research due to their state-of-the-art performance in many NLP tasks.
Cukur, Tolga +4 more
core +1 more source
Compressing Word Embeddings [PDF]
10 pages, 0 figures, submitted to ICONIP-2016. Previous experimental results were submitted to ICLR-2016, but the paper has been significantly updated, since a new experimental set-up worked much ...
openaire +2 more sources
Microstructure Evolution of a VMnFeCoNi High‐Entropy Alloy After Synthesis, Swaging, and Annealing
The synthesis and processing (rotary swaging and annealing) of the novel VMnFeCoNi alloy is investigated, alongside the estimation of the grain size effect on hardness. Analysis of a wide grain size range of recrystallized microstructures (12–210 µm) reveals a low annealing twin density.
Aditya Srinivasan Tirunilai +6 more
wiley +1 more source
Using Multi-Sense Vector Embeddings for Reverse Dictionaries [PDF]
Popular word embedding methods such as word2vec and GloVe assign a single vector representation to each word, even if a word has multiple distinct meanings. Multi-sense embeddings instead provide different vectors for each sense of a word.
de Melo, G. +3 more
core +1 more source
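The contrast drawn in this abstract, one vector per word versus one vector per sense, can be illustrated with a toy sense inventory: given several candidate sense vectors for an ambiguous word, pick the one closest to the averaged context. The vectors and sense labels below are invented for illustration, not the paper's trained multi-sense embeddings:

```python
import numpy as np

# Toy sense inventory: each ambiguous word maps to one vector per sense
# (illustrative values; real multi-sense embeddings are learned from corpora).
senses = {
    "bank": {
        "bank_finance": np.array([0.9, 0.1, 0.0]),
        "bank_river":   np.array([0.0, 0.2, 0.9]),
    }
}
context_vecs = {
    "money": np.array([0.8, 0.2, 0.1]),
    "water": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def disambiguate(word, context_words):
    """Pick the sense whose vector best matches the averaged context."""
    ctx = np.mean([context_vecs[w] for w in context_words], axis=0)
    return max(senses[word], key=lambda s: cosine(senses[word][s], ctx))

print(disambiguate("bank", ["money"]))  # -> bank_finance
print(disambiguate("bank", ["water"]))  # -> bank_river
```

A single-vector model such as word2vec or GloVe would collapse both senses of "bank" into one point, which is exactly the limitation multi-sense embeddings address.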