Results 31 to 40 of about 89,062

When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting [PDF]

open access: yes, Journal of Universal Computer Science, 2022
In 2018, Mikolov et al. introduced the positional language model, which has characteristics of attention-based neural machine translation models and which achieved state-of-the-art performance on the intrinsic word analogy task.
Vít Novotný et al.

SensEmbed: Learning sense embeddings for word and relational similarity [PDF]

open access: yes, 2015
Word embeddings have recently gained considerable popularity for modeling words in different Natural Language Processing (NLP) tasks including semantic similarity measurement.
Ignacio Javier Iacobacci et al.

Biomedical Word Sense Disambiguation with Word Embeddings [PDF]

open access: yes, 2017
There is a growing need for automatic extraction of information and knowledge from the increasing amount of biomedical and clinical data produced, namely in textual form. Natural language processing comes in this direction, helping in tasks such as information extraction and information retrieval.
Rui Antunes, Sérgio Matos

A Gloss Composition and Context Clustering Based Distributed Word Sense Representation Model

open access: yes, Entropy, 2015
In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context clustering based models usually require careful tuning of model parameters, and typically perform worse on infrequent word ...
Tao Chen et al.

Efficient estimation of Hindi WSD with distributed word representation in vector space

open access: yes, Journal of King Saud University: Computer and Information Sciences, 2022
Word Sense Disambiguation (WSD) is significant for improving the accuracy of the interpretation of a Natural language text. Various supervised learning-based models and knowledge-based models have been developed in the literature for WSD of the language ...
Archana Kumari, D. K. Lobiyal

Word Embeddings for Entity-annotated Texts

open access: yes, 2020
Learned vector representations of words are useful tools for many information retrieval and natural language processing tasks due to their ability to capture lexical semantics.
A. Das et al.

Distilling Word Embeddings

open access: yes, Proceedings of the 25th ACM International Conference on Information and Knowledge Management, 2016
Distilling knowledge from a well-trained cumbersome network to a small one has recently become a new research topic, as lightweight neural networks with high performance are particularly in need in various resource-restricted systems. This paper addresses the problem of distilling word embeddings for NLP tasks.
Lili Mou et al.

Reliable Classification of FAQs with Spelling Errors Using an Encoder-Decoder Neural Network in Korean

open access: yes, Applied Sciences, 2019
To resolve lexical disagreement problems between queries and frequently asked questions (FAQs), we propose a reliable sentence classification model based on an encoder-decoder neural network.
Youngjin Jang, Harksoo Kim

Graph-Embedding Empowered Entity Retrieval

open access: yes, 2020
In this research, we improve upon the current state of the art in entity retrieval by re-ranking the result list using graph embeddings. The paper shows that graph embeddings are useful for entity-oriented search tasks.
D. Metzler et al.

The Word Analogy Testing Caveat [PDF]

open access: yes, 2018
There are some important problems in the evaluation of word embeddings using standard word analogy tests. In particular, in virtue of the assumptions made by systems generating the embeddings, these remain tests over randomness.
Natalie Schluter
