Results 61 to 70 of about 89,062

Exploring the Privacy-Preserving Properties of Word Embeddings: Algorithmic Validation Study

open access: yes, Journal of Medical Internet Research, 2020
Background: Word embeddings are dense numeric vectors used to represent language in neural networks. Until recently, there had been no publicly released embeddings trained on clinical data.
Abdalla, Mohamed   +3 more
doaj   +1 more source

WOVe: Incorporating Word Order in GloVe Word Embeddings

open access: yes, International Journal on Engineering, Science and Technology, 2022
Word vector representations open up new opportunities to extract useful information from unstructured text. Defining a word as a vector makes it easy for machine learning algorithms to understand a text and extract information from it. Word vector representations have been used in many applications such as word synonyms, word analogy, syntactic parsing ...
Mohammed Salah Ibrahim   +3 more
openaire   +2 more sources

Adjustment of Coatings Morphology and Particle Distribution of Layered Silicates by Freeze‐Drying for Improved Gas Barriers

open access: yes, Advanced Engineering Materials, EarlyView.
Freeze‐drying of layered silicates is key to obtaining coatings with superior gas barriers. Freeze‐drying of layered silicates modified with dodecylamine (DDA) is a highly effective technique for the preparation of barrier pigments that significantly mitigate the permeation of oxygen, water vapor, and hydrogen through polymer films containing these ...
Joshua Lommes   +4 more
wiley   +1 more source

Comparative analysis of contextual and context-free embeddings in disaster prediction from Twitter data

open access: yes, Machine Learning with Applications, 2022
Twitter is a social media site where people post their personal experiences, opinions, and news. Due to the ubiquitous real-time data availability, many rescue agencies monitor this data regularly to identify disasters, reduce risk, and save lives ...
Sumona Deb, Ashis Kumar Chanda
doaj   +1 more source

What Do Large Language Models Know About Materials?

open access: yes, Advanced Engineering Materials, EarlyView.
If large language models (LLMs) are to be used inside the material discovery and engineering process, they must be benchmarked for the accuracy of their intrinsic material knowledge. The current work introduces 1) a reasoning process through the processing–structure–property–performance chain and 2) a tool for benchmarking knowledge of LLMs concerning ...
Adrian Ehrenhofer   +2 more
wiley   +1 more source

An Experimental Analysis of Deep Neural Network Based Classifiers for Sentiment Analysis Task

open access: yes, IEEE Access, 2023
The application of natural language processing (NLP) to the sentiment analysis task using textual data has wide-scale applications across various domains in a plethora of industries.
Mrigank Shukla, Akhil Kumar
doaj   +1 more source

Semantic Structure and Interpretability of Word Embeddings

open access: yes, 2018
Dense word embeddings, which encode the semantic meanings of words in low-dimensional vector spaces, have become very popular in natural language processing (NLP) research due to their state-of-the-art performance in many NLP tasks.
Cukur, Tolga   +4 more
core   +1 more source

Compressing Word Embeddings [PDF]

open access: yes, 2016
10 pages, 0 figures, submitted to ICONIP-2016. Previous experimental results were submitted to ICLR-2016, but the paper has been significantly updated, since a new experimental set-up worked much ...
openaire   +2 more sources

Microstructure Evolution of a VMnFeCoNi High‐Entropy Alloy After Synthesis, Swaging, and Annealing

open access: yes, Advanced Engineering Materials, EarlyView.
The synthesis and processing (rotary swaging and annealing) of the novel VMnFeCoNi alloy is investigated, alongside the estimation of the grain size effect on hardness. Analysis of a wide grain size range of recrystallized microstructures (12–210 µm) reveals a low annealing twin density.
Aditya Srinivasan Tirunilai   +6 more
wiley   +1 more source

Using Multi-Sense Vector Embeddings for Reverse Dictionaries [PDF]

open access: yes, 2019
Popular word embedding methods such as word2vec and GloVe assign a single vector representation to each word, even if a word has multiple distinct meanings. Multi-sense embeddings instead provide different vectors for each sense of a word.
de Melo, G.   +3 more
core   +1 more source
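The distinction drawn in the abstract above can be made concrete with a toy sketch (not the paper's method; the tiny vocabulary, vectors, and helper names below are invented for illustration): a single-vector table gives "bank" one representation everywhere, while a multi-sense table stores one vector per sense and picks the sense closest to the context.

```python
# Toy illustration of single-vector vs. multi-sense embedding lookup.
# All words and vector values are made up for demonstration purposes.

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

# Single-vector embeddings (word2vec/GloVe style): one vector per word,
# so an ambiguous word like "bank" gets the same vector in every context.
single = {
    "bank":  [0.5, 0.5],
    "river": [0.1, 0.9],
    "money": [0.9, 0.1],
}

# Multi-sense embeddings: a list of sense vectors per word.
multi = {
    "bank": [
        [0.1, 0.9],  # riverbank sense
        [0.9, 0.1],  # financial sense
    ],
}

def pick_sense(word, context_words):
    """Pick the sense vector most similar to the summed context vectors."""
    ctx = [0.0, 0.0]
    for w in context_words:
        ctx = [c + x for c, x in zip(ctx, single[w])]
    return max(multi[word], key=lambda sense: dot(sense, ctx))
```

For example, `pick_sense("bank", ["river"])` selects the riverbank sense vector, while `pick_sense("bank", ["money"])` selects the financial one; a single-vector model cannot make this distinction.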
