Results 51 to 60 of about 83,408 (318)
Contextualized word embeddings have been employed effectively across several tasks in Natural Language Processing, as they have proven to carry useful semantic information. However, linking them to structured sources of knowledge remains difficult.
Bianca Scarlini +2 more
semanticscholar +1 more source
Polysemy is widespread in natural language, which makes it very difficult for machines to process. Word sense disambiguation is a key problem in the field of natural language processing.
Lei Wang, Qun Ai
doaj +1 more source
Nowadays, most research conducted in the field of abstractive text summarization focuses on neural-based models alone, without considering their combination with knowledge-based approaches that could further enhance their efficiency.
P. Kouris +2 more
semanticscholar +1 more source
Unsupervised, Knowledge-Free, and Interpretable Word Sense Disambiguation [PDF]
Interpretability of a predictive model is a powerful feature that gains the trust of users in the correctness of the predictions. In word sense disambiguation (WSD), knowledge-based systems tend to be much more interpretable than knowledge-free ...
Chris Biemann +6 more
core +3 more sources
Word sense disambiguation with pictures
We introduce the use of images for word sense disambiguation, either alone or in conjunction with traditional text-based methods. The approach is based on a recently developed method for automatically annotating images using a statistical model of the joint probability of image regions and words. The model itself is learned from a database of ...
Matthew Johnson, Kobus Barnard
openaire +3 more sources
Biomedical Word Sense Disambiguation with Word Embeddings [PDF]
There is a growing need for automatic extraction of information and knowledge from the increasing amount of biomedical and clinical data produced, namely in textual form. Natural language processing comes in this direction, helping in tasks such as information extraction and information retrieval.
Rui Antunes, Sérgio Matos
openaire +3 more sources
Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation [PDF]
Contextual embeddings represent a new generation of semantic representations learned from Neural Language Modelling (NLM) that addresses the issue of meaning conflation hampering traditional word embeddings.
Daniel Loureiro, A. Jorge
semanticscholar +1 more source
Improved Word Sense Disambiguation with Enhanced Sense Representations
Current state-of-the-art supervised word sense disambiguation (WSD) systems (such as GlossBERT and bi-encoder models) yield surprisingly good results by purely leveraging pre-trained language models and short dictionary definitions (or glosses) of the ...
Yang Song, Xin Cai Ong, H. Ng, Qian Lin
semanticscholar +1 more source
Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations [PDF]
Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity ...
Christian Hadiwinoto +2 more
semanticscholar +1 more source
Recently proposed Word Sense Disambiguation (WSD) systems have approached the estimated upper bound of the task on standard evaluation benchmarks. However, these systems typically disambiguate the words in a document almost independently ...
Ming Wang, Yinglin Wang
semanticscholar +1 more source