Results 31 to 40 of about 2,115,444

Neuro-Symbolic Word Embedding Using Textual and Knowledge Graph Information

open access: yes | Applied Sciences, 2022
The construction of high-quality word embeddings is essential in natural language processing. In existing approaches that use a large text corpus, word embeddings learn only sequential patterns in the context; thus, accurate learning of the syntax and ...
Dongsuk Oh, Jungwoo Lim, Heuiseok Lim
doaj +1 more source

Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change [PDF]

open access: yes | Annual Meeting of the Association for Computational Linguistics, 2016
Understanding how words change their meanings over time is key to models of language and cultural evolution, but historical data on meaning is scarce, making theories hard to develop and test.
William L. Hamilton +2 more
semanticscholar +1 more source

Be Careful about Poisoned Word Embeddings: Exploring the Vulnerability of the Embedding Layers in NLP Models [PDF]

open access: yes | North American Chapter of the Association for Computational Linguistics, 2021
Recent studies have revealed a security threat to natural language processing (NLP) models, called the Backdoor Attack. Victim models can maintain competitive performance on clean samples while behaving abnormally on samples with a specific trigger word ...
Wenkai Yang +5 more
semanticscholar +1 more source

A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings [PDF]

open access: yes | Annual Meeting of the Association for Computational Linguistics, 2018
Recent work has managed to learn cross-lingual word embeddings without parallel data by mapping monolingual embeddings to a shared space through adversarial training. However, their evaluation has focused on favorable conditions, using comparable corpora ...
Mikel Artetxe +2 more
semanticscholar +1 more source

A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages [PDF]

open access: yes | Annual Meeting of the Association for Computational Linguistics, 2020
We use the multilingual OSCAR corpus, extracted from Common Crawl via language classification, filtering and cleaning, to train monolingual contextualized word embeddings (ELMo) for five mid-resource languages.
Pedro Ortiz Suarez +2 more
semanticscholar +1 more source

Multi-sense Embeddings Using Synonym Sets and Hypernym Information from Wordnet

open access: yes | International Journal of Interactive Multimedia and Artificial Intelligence, 2021
Word embedding approaches have increased the efficiency of natural language processing (NLP) tasks. Traditional word embeddings, though robust for many NLP activities, do not handle polysemy of words.
Krishna Siva Prasad Mudigonda +1 more
doaj +1 more source

Word embeddings quantify 100 years of gender and ethnic stereotypes [PDF]

open access: yes | Proceedings of the National Academy of Sciences of the United States of America, 2017
Significance: Word embeddings are a popular machine-learning method that represents each English word by a vector, such that the geometry between these vectors captures semantic relations between the corresponding words.
Nikhil Garg +3 more
semanticscholar +1 more source

Detecting Emergent Intersectional Biases: Contextualized Word Embeddings Contain a Distribution of Human-like Biases [PDF]

open access: yes | AAAI/ACM Conference on AI, Ethics, and Society, 2020
Starting from the premise that implicit human biases are reflected in the statistical regularities of language, it is possible to measure biases in English static word embeddings.
W. Guo, Aylin Caliskan
semanticscholar +1 more source

Learned Text Representation for Amharic Information Retrieval and Natural Language Processing

open access: yes | Information, 2023
Over the past few years, word embeddings and Bidirectional Encoder Representations from Transformers (BERT) models have brought better solutions to learning text representations for natural language processing (NLP) and other tasks. Many NLP applications ...
Tilahun Yeshambel +2 more
doaj +1 more source

Word Alignment by Fine-tuning Embeddings on Parallel Corpora [PDF]

open access: yes | Conference of the European Chapter of the Association for Computational Linguistics, 2021
Word alignment over parallel corpora has a wide variety of applications, including learning translation lexicons, cross-lingual transfer of language processing tools, and automatic evaluation or analysis of translation outputs. The great majority of past ...
Zi-Yi Dou, Graham Neubig
semanticscholar +1 more source
