Results 51 to 60 of about 89,062
Ontology-Aware Token Embeddings for Prepositional Phrase Attachment
Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language.
Ammar, Waleed +3 more
core +1 more source
To integrate multiple transcriptomics data with severe batch effects for identifying MB subtypes, we developed a novel and accurate computational method named RaMBat, which leveraged subtype‐specific gene expression ranking information instead of absolute gene expression levels to address batch effects of diverse data sources.
Mengtao Sun, Jieqiong Wang, Shibiao Wan
wiley +1 more source
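The rank-based idea in the snippet above can be illustrated with a minimal sketch (not RaMBat's actual implementation; the function name is hypothetical): ranking genes within each sample discards absolute expression levels, so samples that share the same relative ordering look identical even when measured at different scales.

```python
import numpy as np

def to_within_sample_ranks(expr):
    """Convert a genes x samples expression matrix to within-sample ranks.

    Ranking each sample's genes independently discards absolute expression
    levels, one way to blunt batch effects across platforms. Hypothetical
    helper for illustration only.
    """
    expr = np.asarray(expr, dtype=float)
    # argsort twice yields 0-based ranks along the gene axis for each sample
    return expr.argsort(axis=0).argsort(axis=0)

# Two "batches" measuring the same relative pattern at different scales
batch_a = np.array([[1.0], [5.0], [3.0]])
batch_b = np.array([[10.0], [50.0], [30.0]])  # same ordering, 10x scale
assert (to_within_sample_ranks(batch_a) == to_within_sample_ranks(batch_b)).all()
```

The assertion holds because both columns induce the same gene ordering, which is the property that makes rank information robust to per-batch scaling.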
Transformer models are the state of the art in Natural Language Processing (NLP) and the core of Large Language Models (LLMs). We propose a transformer-based model for transition-based dependency parsing of free word order languages.
Fatima Tuz Zuhra +2 more
doaj +1 more source
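The transition-based setup the snippet refers to can be illustrated with the classical arc-standard system (a generic sketch, not this paper's model; in the proposed parser a transformer would predict the action sequence, which here is supplied by hand):

```python
def arc_standard(words, actions):
    """Apply a sequence of arc-standard transitions to a sentence.

    'SHIFT' moves the next buffer word onto the stack;
    'LEFT'  attaches stack[-2] as a dependent of stack[-1];
    'RIGHT' attaches stack[-1] as a dependent of stack[-2].
    Returns (dependent, head) arcs.
    """
    stack, buffer, arcs = [], list(range(len(words))), []
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act == "LEFT":
            dep = stack.pop(-2)          # second-from-top is the dependent
            arcs.append((words[dep], words[stack[-1]]))
        elif act == "RIGHT":
            dep = stack.pop()            # top of stack is the dependent
            arcs.append((words[dep], words[stack[-1]]))
    return arcs

# "She eats fish": She <- eats -> fish
arcs = arc_standard(["She", "eats", "fish"],
                    ["SHIFT", "SHIFT", "LEFT", "SHIFT", "RIGHT"])
```

Transition systems like this are attractive for free word order languages because the classifier sees the full stack/buffer context at each step rather than committing to a fixed attachment order.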
Quantum-Inspired Complex Word Embedding [PDF]
A challenging task for word embeddings is to capture the emergent meaning or polarity of a combination of individual words. For example, existing word embedding approaches will assign high probabilities to the words "Penguin" and "Fly" if they frequently co-occur, but they fail to capture the fact that the words occur in an opposite sense - Penguins do ...
Li, Qiuchi +3 more
openaire +2 more sources
Objective To support high‐quality, patient‐centered care for systemic lupus erythematosus (SLE), the American College of Rheumatology (ACR) developed evidence‐based measures incorporating clinical and patient‐reported outcome measures (PROMs). Using the Consolidated Framework for Implementation Research (CFIR), we conducted semi‐structured interviews ...
Catherine Nasrallah +13 more
wiley +1 more source
Lexicon-Enhanced LSTM With Attention for General Sentiment Analysis
Long short-term memory networks (LSTMs) have gained good performance in sentiment analysis tasks. The general method is to use LSTMs to combine word embeddings for text representation.
Xianghua Fu +4 more
doaj +1 more source
Learning Bilingual Word Embedding Mappings with Similar Words in Related Languages Using GAN
Cross-lingual word embeddings represent words from different languages in the same vector space. They support reasoning about semantics and comparison of word meanings across languages and in multilingual contexts, which is necessary for bilingual lexicon
Ghafour Alipour +2 more
doaj +1 more source
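For reference, the standard supervised alternative to the adversarial approach above learns an orthogonal linear map between the two embedding spaces (orthogonal Procrustes). A minimal sketch under that simpler formulation, not the paper's GAN method:

```python
import numpy as np

def procrustes_map(X, Y):
    """Learn an orthogonal W mapping source embeddings X toward target Y.

    X, Y: (n_pairs, dim) arrays whose rows are embeddings of known
    translation pairs. Minimizes ||X @ W.T - Y|| over orthogonal W via
    the SVD of Y.T @ X (classical orthogonal Procrustes solution).
    """
    U, _, Vt = np.linalg.svd(Y.T @ X)
    return U @ Vt

# Toy check: if Y is an exact rotation of X, the map recovers the rotation.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))
theta = np.pi / 6
R = np.eye(4)
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]
Y = X @ R.T
W = procrustes_map(X, Y)
assert np.allclose(X @ W.T, Y)
```

The GAN formulation in the paper removes the need for the seed translation pairs that this closed-form solution requires.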
Obtaining high-quality embeddings of out-of-vocabularies (OOVs) and low-frequency words is a challenge in natural language processing (NLP). To efficiently estimate the embeddings of OOVs and low-frequency words, we propose a new method that uses the ...
Xianwen Liao +5 more
doaj +1 more source
Query Expansion with Locally-Trained Word Embeddings
Continuous space word embeddings have received a great deal of attention in the natural language processing and machine learning communities for their ability to model term similarity and other relationships.
Craswell, Nick +2 more
core +1 more source
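The basic expansion mechanism can be sketched as nearest-neighbor lookup in embedding space (a generic illustration with hypothetical toy data; the paper's point is that the embeddings should be trained locally, on top-ranked documents for the query, rather than on a global corpus):

```python
import numpy as np

def expand_query(query_terms, vocab, vectors, k=2):
    """Expand a query with its k nearest neighbors in embedding space.

    vocab: list of terms; vectors: (len(vocab), dim) unit-normalized rows,
    so dot products are cosine similarities. With locally trained
    embeddings, `vectors` would come from the query's retrieved documents.
    """
    idx = {t: i for i, t in enumerate(vocab)}
    q = np.mean([vectors[idx[t]] for t in query_terms], axis=0)
    order = np.argsort(-(vectors @ q))  # most similar first
    expansion = [vocab[i] for i in order if vocab[i] not in query_terms][:k]
    return query_terms + expansion

# Hand-crafted toy vocabulary for illustration
vocab = ["car", "automobile", "vehicle", "banana"]
V = np.array([[1.0, 0.0], [0.95, 0.05], [0.8, 0.2], [0.0, 1.0]])
V /= np.linalg.norm(V, axis=1, keepdims=True)
expanded = expand_query(["car"], vocab, V, k=2)
```

Here "car" is expanded with its two most similar terms; the design question the paper addresses is which corpus those similarities are estimated from.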
A Robust Adaptive One‐Sample‐Ahead Preview Super‐Twisting Sliding Mode Controller
Figure: Block diagram of the robust adaptive one‐sample‐ahead preview super‐twisting sliding mode controller.
ABSTRACT This article introduces a discrete‐time robust adaptive one‐sample‐ahead preview super‐twisting sliding mode controller. A stability analysis of the controller by Lyapunov criteria is developed to demonstrate its robustness in handling both ...
Guilherme Vieira Hollweg +5 more
wiley +1 more source