Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation [PDF]
Contextual embeddings represent a new generation of semantic representations learned from Neural Language Modelling (NLM) that addresses the issue of meaning conflation hampering traditional word embeddings.
Daniel Loureiro, A. Jorge
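A minimal, hedged sketch of the idea summarised in this entry, as read from the abstract: average contextual embeddings of sense-annotated examples, propagate the resulting vectors through WordNet relations so that every synset gets a representation, and disambiguate by nearest neighbour. The `encode` helper and the `tagged_examples` structure are assumptions for illustration only; this is not the authors' implementation.

```python
# Illustrative sketch (not the authors' code) of full-coverage sense embeddings:
# 1) average contextual embeddings of sense-annotated examples to get synset vectors,
# 2) propagate vectors through WordNet so uncovered synsets also get a representation,
# 3) disambiguate by nearest neighbour over a lemma's candidate synsets.
# Hypothetical pieces: `encode(sentence, word)` returns a contextual vector from some NLM,
# and `tagged_examples` maps a synset name to (sentence, word) training examples.
import numpy as np
from nltk.corpus import wordnet as wn


def build_sense_embeddings(tagged_examples, encode):
    """Average the contextual embeddings of each synset's annotated examples."""
    return {
        name: np.mean([encode(sent, word) for sent, word in examples], axis=0)
        for name, examples in tagged_examples.items()
    }


def propagate_through_wordnet(sense_vecs):
    """Give uncovered synsets a vector by averaging covered hyponyms (single pass, for brevity)."""
    full = dict(sense_vecs)
    for synset in wn.all_synsets():
        if synset.name() in full:
            continue
        covered = [full[h.name()] for h in synset.hyponyms() if h.name() in full]
        if covered:
            full[synset.name()] = np.mean(covered, axis=0)
    return full


def disambiguate(sentence, word, lemma, full_vecs, encode):
    """Return the candidate synset whose vector is closest (cosine) to the contextual vector."""
    ctx = encode(sentence, word)
    best, best_score = None, -1.0
    for synset in wn.synsets(lemma):
        vec = full_vecs.get(synset.name())
        if vec is None:
            continue
        score = np.dot(ctx, vec) / (np.linalg.norm(ctx) * np.linalg.norm(vec))
        if score > best_score:
            best, best_score = synset, score
    return best
```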
AMuSE-WSD: An All-in-one Multilingual System for Easy Word Sense Disambiguation
Over the past few years, Word Sense Disambiguation (WSD) has received renewed interest: recently proposed systems have shown the remarkable effectiveness of deep learning techniques in this task, especially when aided by modern pretrained language models.
Riccardo Orlando +4 more
Unsupervised, Knowledge-Free, and Interpretable Word Sense Disambiguation [PDF]
Interpretability of a predictive model is a powerful feature that gains the trust of users in the correctness of the predictions. In word sense disambiguation (WSD), knowledge-based systems tend to be much more interpretable than knowledge-free ...
Chris Biemann +6 more
Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations [PDF]
Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity ...
Christian Hadiwinoto +2 more
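To make the opening claim of this abstract concrete, here is a small, self-contained example (my own, not the paper's code) showing that the same surface word receives different contextual vectors in different sentences, which is the property that nearest-neighbour WSD over such representations exploits. The model name and the toy sentences are arbitrary choices.

```python
# Demonstrate that the same word gets different contextual vectors in different contexts.
# Uses the Hugging Face `transformers` library; bert-base-uncased is an arbitrary choice.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()


def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the hidden state of the first occurrence of `word` in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]           # (seq_len, dim)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    idx = tokens.index(word)                                  # assumes `word` stays one wordpiece
    return hidden[idx]


v_money = word_vector("she deposited cash at the bank", "bank")
v_river = word_vector("they walked along the bank of the river", "bank")
cos = torch.nn.functional.cosine_similarity(v_money, v_river, dim=0)
print(f"cosine similarity between the two 'bank' vectors: {cos.item():.3f}")
```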
Improved Word Sense Disambiguation with Enhanced Sense Representations
Current state-of-the-art supervised word sense disambiguation (WSD) systems (such as GlossBERT and bi-encoder models) yield surprisingly good results by purely leveraging pre-trained language models and short dictionary definitions (or glosses) of the ...
Yang Song, Xin Cai Ong, H. Ng, Qian Lin
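The gloss-based systems named in this entry score a target word's context against short dictionary definitions. Below is a hedged, bi-encoder-style sketch of that general recipe over WordNet glosses; the `sentence-transformers` encoder and the model name are my own stand-ins for illustration, not the cited models.

```python
# Bi-encoder-style gloss scoring sketch: encode the context and each candidate gloss
# independently, then rank glosses by cosine similarity. Illustrative only; it is not
# the code of GlossBERT or of the bi-encoder model mentioned in the entry above.
from nltk.corpus import wordnet as wn
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # arbitrary off-the-shelf encoder


def rank_senses(context: str, lemma: str):
    """Return candidate synsets sorted by similarity between context and gloss embeddings."""
    synsets = wn.synsets(lemma)
    if not synsets:
        return []
    ctx_vec = encoder.encode(context, convert_to_tensor=True)
    gloss_vecs = encoder.encode([s.definition() for s in synsets], convert_to_tensor=True)
    scores = util.cos_sim(ctx_vec, gloss_vecs)[0]
    return sorted(zip(synsets, scores.tolist()), key=lambda pair: pair[1], reverse=True)


for synset, score in rank_senses("He sat on the bank and watched the river flow", "bank")[:3]:
    print(f"{score:.3f}  {synset.name():30s} {synset.definition()}")
```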
Lately proposed Word Sense Disambiguation (WSD) systems have approached the estimated upper bound of the task on standard evaluation benchmarks. However, these systems typically implement the disambiguation of words in a document almost independently ...
Ming Wang, Yinglin Wang
Word sense disambiguation with pictures [PDF]
We introduce a method for using images for word sense disambiguation, either alone or in conjunction with traditional text-based methods. The approach is based on recent work on a method for predicting words for images, which can be learned from image datasets with associated text.
David Forsyth +2 more
Interpretability in Word Sense Disambiguation using Tsetlin Machine
Word Sense Disambiguation (WSD) is a longstanding unresolved task in Natural Language Processing. The challenge lies in the fact that words with the same spelling can have completely different senses, sometimes depending on subtle characteristics of ...
Rohan Kumar Yadav +3 more
Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences [PDF]
Domain adaptation or transfer learning using pre-trained language models such as BERT has proven to be an effective approach for many natural language processing tasks.
Boon Peng Yap +2 more
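As a rough illustration of what a gloss selection objective trains on, the sketch below assembles context-gloss groups from WordNet, with one positive gloss per annotated word and the lemma's remaining glosses as in-group negatives. The `annotations` format is an assumption of mine; the paper's actual training procedure, loss, and use of example sentences are not reproduced here.

```python
# Hedged sketch: building context-gloss groups for gloss-selection style fine-tuning.
# Each group contains exactly one row labelled 1, so a selection/ranking loss can be
# applied within the group. Input format is hypothetical.
from nltk.corpus import wordnet as wn


def context_gloss_groups(annotations):
    """annotations: iterable of (sentence, lemma, gold_synset_name) triples.
    Yields, per annotation, a list of (sentence, gloss, label) rows."""
    for sentence, lemma, gold in annotations:
        group = []
        for synset in wn.synsets(lemma):
            label = 1 if synset.name() == gold else 0
            group.append((sentence, synset.definition(), label))
        if any(label for _, _, label in group):
            yield group


example = [("He sat on the bank of the river and fished", "bank", "bank.n.01")]
for group in context_gloss_groups(example):
    for sentence, gloss, label in group:
        print(label, "|", gloss[:60])
```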
Efficient estimation of Hindi WSD with distributed word representation in vector space
Word Sense Disambiguation (WSD) is significant for improving the accuracy of the interpretation of natural language text. Various supervised learning-based models and knowledge-based models have been developed in the literature for WSD of the language ...
Archana Kumari, D.K. Lobiyal
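The general recipe behind this line of work can be sketched with static word vectors: represent the context and each sense definition as averaged word vectors, then pick the closest sense. The sketch below uses English WordNet and a placeholder word2vec file purely to illustrate the mechanism; the paper's Hindi resources and exact model are not reproduced.

```python
# Embedding-based sense scoring with static (word2vec-style) vectors: an illustrative
# sketch of the general approach, not the paper's Hindi setup. The vector file path is
# a placeholder; English WordNet is used only so the example is self-contained.
import numpy as np
from gensim.models import KeyedVectors
from nltk.corpus import wordnet as wn

vectors = KeyedVectors.load_word2vec_format("word-vectors.bin", binary=True)  # placeholder


def avg_vector(words):
    vecs = [vectors[w] for w in words if w in vectors]
    return np.mean(vecs, axis=0) if vecs else None


def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def disambiguate(context_words, lemma):
    """Score each WordNet sense of `lemma` by cosine between context and definition vectors."""
    ctx = avg_vector(context_words)
    best, best_score = None, -1.0
    for synset in wn.synsets(lemma):
        defn = avg_vector(synset.definition().lower().split())
        if ctx is None or defn is None:
            continue
        score = cosine(ctx, defn)
        if score > best_score:
            best, best_score = synset, score
    return best, best_score
```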