Results 51 to 60 of about 259,269

Phonetic Word Embeddings

open access: yes, 2021
This work presents a novel methodology for calculating the phonetic similarity between words, taking motivation from the human perception of sounds. This metric is employed to learn a continuous vector embedding space that groups similar-sounding words together and can be used for various downstream computational phonology tasks.
Sharma, Rahul   +2 more
openaire   +2 more sources

Closed Form Word Embedding Alignment [PDF]

open access: yes, 2019 IEEE International Conference on Data Mining (ICDM), 2019
We develop a family of techniques to align word embeddings which are derived from different source datasets or created using different mechanisms (e.g., GloVe or word2vec). Our methods are simple and have a closed form to optimally rotate, translate, and scale to minimize root mean squared errors or maximize the average cosine similarity between two ...
Sunipa Dev   +2 more
openaire   +2 more sources
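
The closed-form alignment described in this abstract is in the family of the classic orthogonal Procrustes solution. A minimal sketch of that idea, assuming two embedding matrices with rows already matched by word (variable names and the toy check are illustrative, not the authors' implementation):

```python
# Hedged sketch: closed-form alignment of two embedding matrices via
# orthogonal Procrustes (SVD), finding a rotation R, isotropic scale s,
# and translation t that minimize ||s * X @ R + t - Y||_F.
import numpy as np

def align(X, Y):
    """Closed-form rotation, scale, and translation aligning X to Y."""
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mx, Y - my                 # center both embedding sets
    U, S, Vt = np.linalg.svd(Xc.T @ Yc)     # SVD of the cross-covariance
    R = U @ Vt                              # optimal rotation
    s = S.sum() / (Xc ** 2).sum()           # optimal isotropic scale
    t = my - s * mx @ R                     # translation aligning the means
    return R, s, t

# Toy check: recover a known rigid transform of random "embeddings".
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # random orthogonal matrix
Y = 2.0 * X @ Q + 1.0                             # scaled, rotated, shifted copy
R, s, t = align(X, Y)
print(np.allclose(s * X @ R + t, Y))
```

On exactly transformed data this recovers the original rotation, scale, and shift; with embeddings from different training runs it gives the least-squares best fit instead.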

Tumor and germline testing with next generation sequencing in epithelial ovarian cancer: a prospective paired comparison using an 18‐gene panel

open access: yes, Molecular Oncology, EarlyView.
Genetic testing in epithelial ovarian cancer includes both germline and tumor testing. This approach often duplicates resources. The current prospective study assessed the feasibility of tumor-first multigene testing by comparing tumor tissue with germline testing of peripheral blood using an 18-gene NGS panel in 106 patients.
Elisabeth Spenard   +12 more
wiley   +1 more source

Improving Word Embedding Using Variational Dropout

open access: yes, Proceedings of the International Florida Artificial Intelligence Research Society Conference, 2023
Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve the pre-trained word embeddings.
Zainab Albujasim   +3 more
doaj   +1 more source

Rotations and Interpretability of Word Embeddings: the Case of the Russian Language

open access: yes, 2017
Consider a continuous word embedding model. Usually, the cosines between word vectors are used as a measure of similarity of words. These cosines do not change under orthogonal transformations of the embedding space.
Zobnin, Alexey
core   +1 more source
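
The invariance this abstract relies on is easy to verify directly: an orthogonal transformation preserves dot products and norms, so pairwise cosines between word vectors are unchanged. A minimal check (the random vectors stand in for word embeddings):

```python
# Minimal check: cosine similarity between vectors is invariant under
# an orthogonal transformation Q of the embedding space.
import numpy as np

def cosine(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

rng = np.random.default_rng(42)
u = rng.standard_normal(300)
v = rng.standard_normal(300)
Q, _ = np.linalg.qr(rng.standard_normal((300, 300)))  # random orthogonal matrix

print(np.isclose(cosine(u, v), cosine(Q @ u, Q @ v)))
```

This is why rotating an embedding space, e.g. toward more interpretable axes as the paper proposes, costs nothing on cosine-based similarity benchmarks.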

Quantum-Inspired Complex Word Embedding [PDF]

open access: yes, Proceedings of The Third Workshop on Representation Learning for NLP, 2018
A challenging task for word embeddings is to capture the emergent meaning or polarity of a combination of individual words. For example, existing approaches in word embeddings will assign high probabilities to the words "Penguin" and "Fly" if they frequently co-occur, but they fail to capture the fact that they occur in an opposite sense - Penguins do ...
Li, Qiuchi   +3 more
openaire   +2 more sources

Developing evidence‐based, cost‐effective P4 cancer medicine for driving innovation in prevention, therapeutics, patient care and reducing healthcare inequalities

open access: yes, Molecular Oncology, EarlyView.
The cancer problem is increasing globally with projections up to the year 2050 showing unfavourable outcomes in terms of incidence and cancer‐related deaths. The main challenges are prevention, improved therapeutics resulting in increased cure rates and enhanced health‐related quality of life.
Ulrik Ringborg   +43 more
wiley   +1 more source

A supervised topic embedding model and its application.

open access: yes, PLoS ONE, 2022
We propose rTopicVec, a supervised topic embedding model that predicts response variables associated with documents by analyzing the text data. Topic modeling leverages document-level word co-occurrence patterns to learn latent topics of each document ...
Weiran Xu, Koji Eguchi
doaj   +1 more source

A New Sentiment-Enhanced Word Embedding Method for Sentiment Analysis

open access: yes, Applied Sciences, 2022
Because some sentiment words have similar syntactic and semantic features in the corpus, existing pre-trained word embeddings often perform poorly in sentiment analysis tasks.
Qizhi Li   +4 more
doaj   +1 more source

Embedding Semantic Relations into Word Representations [PDF]

open access: yes, 2015
Learning representations for semantic relations is important for various tasks such as analogy detection, relational search, and relation classification.
Bollegala, Danushka   +2 more
core   +1 more source
