Results 91 to 100 of about 2,031,469

Human papillomavirus (HPV) prediction for oropharyngeal cancer based on CT by using off‐the‐shelf features: A dual‐dataset study

open access: yesJournal of Applied Clinical Medical Physics, EarlyView.
Abstract. Background: This study aims to develop a novel predictive model for determining human papillomavirus (HPV) presence in oropharyngeal cancer using computed tomography (CT). Current image‐based HPV prediction methods are hindered by high computational demands or suboptimal performance.
Junhua Chen   +3 more
wiley   +1 more source

A New Sentiment-Enhanced Word Embedding Method for Sentiment Analysis

open access: yesApplied Sciences, 2022
Since some sentiment words have similar syntactic and semantic features in the corpus, existing pre-trained word embeddings often perform poorly in sentiment analysis tasks.
Qizhi Li   +4 more
doaj   +1 more source

Spanish Biomedical and Clinical Language Embeddings [PDF]

open access: yesarXiv, 2021
We computed both word and sub-word embeddings using FastText. For the sub-word embeddings, we selected the Byte Pair Encoding (BPE) algorithm to represent sub-words. We evaluated the biomedical word embeddings and obtained better results than previous versions, suggesting that more data yields better representations.
arxiv  
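The abstract above mentions BPE for sub-word representation. A minimal sketch of the core BPE step, repeatedly merging the most frequent adjacent symbol pair, is shown below; the toy vocabulary and frequencies are illustrative assumptions, not data from the paper, and real tokenizers (FastText, SentencePiece) add many refinements:

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across all tokenized words."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Replace every occurrence of the pair with one merged symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy vocabulary: word (as a tuple of characters) -> corpus frequency
vocab = {tuple("lower"): 5, tuple("lowest"): 2, tuple("slow"): 3, tuple("lot"): 4}
pair = most_frequent_pair(vocab)   # ('l', 'o') is the most frequent pair
vocab = merge_pair(vocab, pair)
```

Running the merge loop for a fixed number of iterations produces the sub-word inventory over which embeddings are then learned.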

A Comparative Study on Word Embeddings in Deep Learning for Text Classification

open access: yesInternational Conference on Natural Language Processing and Information Retrieval, 2020
Word embeddings act as an important component of deep models for providing input features in downstream language tasks, such as sequence labelling and text classification.
Congcong Wang, P. Nulty, David Lillis
semanticscholar   +1 more source

How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions [PDF]

open access: yesAnnual Meeting of the Association for Computational Linguistics, 2019
Cross-lingual word embeddings (CLEs) facilitate cross-lingual transfer of NLP models. Despite their ubiquitous downstream usage, increasingly popular projection-based CLE models are almost exclusively evaluated on bilingual lexicon induction (BLI).
Goran Glavas   +3 more
semanticscholar   +1 more source
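The abstract notes that projection-based CLE models are evaluated almost exclusively on bilingual lexicon induction (BLI). A minimal sketch of the standard BLI metric, precision@1 via nearest-neighbour retrieval under cosine similarity, follows; the toy English and German vectors and the gold dictionary are assumptions for illustration only:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def bli_precision_at_1(src_vecs, tgt_vecs, gold):
    """Fraction of source words whose nearest target neighbour
    in the shared space is the gold translation."""
    hits = 0
    for src_word, gold_tgt in gold.items():
        query = src_vecs[src_word]
        best = max(tgt_vecs, key=lambda w: cosine(query, tgt_vecs[w]))
        hits += (best == gold_tgt)
    return hits / len(gold)

# Toy aligned space (illustrative numbers only)
en = {"dog": (0.9, 0.1), "cat": (0.1, 0.9)}
de = {"Hund": (0.8, 0.2), "Katze": (0.2, 0.8)}
p1 = bli_precision_at_1(en, de, {"dog": "Hund", "cat": "Katze"})
```

The paper's point is precisely that this one metric is a narrow proxy for downstream cross-lingual transfer quality.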

Word Embeddings in Sentiment Analysis [PDF]

open access: yes, 2018
In recent years, sentiment analysis and its applications have grown steadily in popularity. In this field, machine learning and word representation learning derived from distributional semantics (i.e., word embeddings) have proven very successful in sentiment analysis tasks.
Petrolito R, Dell'Orletta F
openaire   +1 more source

Artificial Receptor in Synthetic Cells Performs Transmembrane Activation of Proteolysis

open access: yesAdvanced Biology, EarlyView.
Transmembrane signaling is a hallmark of living cells and is among the greatest challenges in the design of synthetic cells. Herein, an artificial receptor based on the chemistry of self‐immolative linkers is used to communicate information across the lipid bilayer, for transmembrane activation of enzymatic activity. Abstract The design of artificial,
Ane Bretschneider Søgaard   +7 more
wiley   +1 more source

Understanding and Creating Word Embeddings

open access: yesThe Programming Historian
Word embeddings allow you to analyze the usage of different terms in a corpus of texts by capturing information about their contextual usage. Through a primarily theoretical lens, this lesson will teach you how to prepare a corpus and train a word ...
Avery Blankenship   +2 more
doaj   +1 more source

Word Tour: One-dimensional Word Embeddings via the Traveling Salesman Problem [PDF]

open access: yesarXiv, 2022
Word embeddings are one of the most fundamental technologies used in natural language processing. Existing word embeddings are high-dimensional and consume considerable computational resources. In this study, we propose WordTour, an unsupervised one-dimensional word embedding.
arxiv  
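WordTour obtains its one-dimensional ordering by solving a travelling salesman problem over word embedding distances. A cheap stand-in for a proper TSP solver (the paper uses an exact-style solver, not this heuristic) is a greedy nearest-neighbour tour; the 2-D toy vectors below are assumptions for illustration:

```python
import math

def greedy_tour(vecs, start):
    """Order words so each step moves to the nearest unvisited
    neighbour -- a greedy approximation of a TSP tour."""
    tour, remaining = [start], set(vecs) - {start}
    while remaining:
        cur = vecs[tour[-1]]
        nxt = min(remaining, key=lambda w: math.dist(cur, vecs[w]))
        tour.append(nxt)
        remaining.remove(nxt)
    return tour

# Toy 2-D "embeddings": semantically similar words sit close together
vecs = {
    "cat": (0.0, 0.0), "dog": (0.1, 0.1),
    "car": (5.0, 5.0), "bus": (5.1, 5.2),
}
tour = greedy_tour(vecs, "cat")   # similar words end up adjacent in the tour
```

The position of each word along the resulting tour is its one-dimensional embedding, so neighbouring words in the order are close in the original space.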

On the Dimensionality of Word Embedding

open access: yes, 2018
In this paper, we provide a theoretical understanding of word embedding and its dimensionality. Motivated by the unitary-invariance of word embedding, we propose the Pairwise Inner Product (PIP) loss, a novel metric on the dissimilarity between word embeddings.
Yin, Zi, Shen, Yuanyuan
openaire   +2 more sources
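The PIP loss described above compares two embedding matrices through their pairwise inner product matrices, PIP(E) = E Eᵀ, taking the Frobenius norm of the difference. Because (EU)(EU)ᵀ = E Eᵀ for any unitary U, the metric is invariant to rotations of the embedding space, which is the unitary-invariance motivating the paper. A pure-Python sketch with assumed toy matrices:

```python
import math

def pip_matrix(E):
    """PIP matrix E @ E.T for a matrix given as a list of rows."""
    return [[sum(a * b for a, b in zip(r1, r2)) for r2 in E] for r1 in E]

def pip_loss(E1, E2):
    """Frobenius norm of PIP(E1) - PIP(E2)."""
    P1, P2 = pip_matrix(E1), pip_matrix(E2)
    return math.sqrt(sum((a - b) ** 2
                         for r1, r2 in zip(P1, P2)
                         for a, b in zip(r1, r2)))

# Toy embeddings: E2 is E1 rotated by 90 degrees, so the PIP loss is 0 --
# rotating an embedding changes no pairwise inner product.
E1 = [[1.0, 0.0], [0.0, 2.0]]
E2 = [[0.0, 1.0], [-2.0, 0.0]]   # rows of E1 times a rotation matrix
loss = pip_loss(E1, E2)
```

Minimizing the PIP loss over candidate dimensionalities is what lets the paper select an embedding dimension in a principled way.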
