Results 71 to 80 of about 89,062 (279)
A Mixture Model for Learning Multi-Sense Word Embeddings
Word embeddings are now a standard technique for inducing meaning representations for words. To obtain good representations, it is important to take the different senses of a word into account.
Modi, Ashutosh +4 more
core +1 more source
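(Not the method of the paper above; a minimal sketch of one common way to handle multiple senses: keep several sense vectors per word and mix them by similarity to the surrounding context. The store `sense_vectors`, the helper `embed_in_context`, and the toy dimensions are illustrative assumptions.)

```python
# Minimal sketch: soft-weight per-sense vectors against the context
# to produce a single, context-dependent word embedding.
import numpy as np

rng = np.random.default_rng(0)
DIM, N_SENSES = 50, 3

# Hypothetical store: an ambiguous word keeps several sense vectors.
sense_vectors = {
    "bank": rng.normal(size=(N_SENSES, DIM)),
    "river": rng.normal(size=(1, DIM)),
    "money": rng.normal(size=(1, DIM)),
}

def context_vector(context_words):
    """Average the (first-sense) vectors of the context words."""
    vecs = [sense_vectors[w][0] for w in context_words if w in sense_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(DIM)

def embed_in_context(word, context_words, temperature=1.0):
    """Mixture embedding: weight each sense by its similarity to the context."""
    senses = sense_vectors[word]                 # (K, DIM)
    ctx = context_vector(context_words)          # (DIM,)
    scores = senses @ ctx / temperature          # one score per sense
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over senses
    return weights @ senses                      # convex combination of senses

print(embed_in_context("bank", ["river"]).shape)  # (50,)
```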
This work demonstrates the successful integration of a phenanthroline‐based 2D COF with Mnᴵ catalytic sites into a catholyte‐free membrane‐electrode‐assembly cell for CO2 electroreduction. The crystalline COF actively suppresses Mn⁰–Mn⁰ dimerization, achieving a turnover frequency of 617 h⁻¹ at 2.8 V (full‐cell potential) and enabling stable operation.
Laura Spies +8 more
wiley +1 more source
Spoken-Word Recognition: The Access to Embedded Words
Two cross-modal priming experiments investigated whether the representation of either an initial- or a final-embedded word may be activated when the longer carrier word is auditorily presented. Visual targets were semantically related either to the embedded word or to the carrier word, or were unrelated to the primes. A priming effect was found for ...
Isel, F., Bacri, N.
openaire +3 more sources
Activation of embedded words in spoken word recognition. [PDF]
Three cross-modal associative priming experiments investigated whether speech input activates words that are embedded in other words.
Vroomen, Jean, De Gelder, Béatrice
openaire +2 more sources
Controlled syntheses of lanthanide coordination polymers based on the dihydroxybenzoquinone (DHBQ) organic linker afforded large single crystals of Ln‐DHBQ CPs (Ln = Yb, Nd). A novel structural variant of Yb‐DHBQ is identified by means of single crystal diffraction analysis.
Marina I. Schönherr +7 more
wiley +1 more source
Better Word Representation Vectors Using Syllabic Alphabet: A Case Study of Swahili
Deep learning has been used extensively in natural language processing, with sub-word representation vectors playing a critical role. However, this cannot be said of Swahili, a low-resource yet widely spoken language in East and Central Africa ...
Casper S. Shikali +3 more
doaj +1 more source
A New Sentiment-Enhanced Word Embedding Method for Sentiment Analysis
Because some sentiment words share similar syntactic and semantic features in the corpus, existing pre-trained word embeddings often perform poorly on sentiment analysis tasks.
Qizhi Li +4 more
doaj +1 more source
Morphological Priors for Probabilistic Neural Word Embeddings
Word embeddings allow natural language processing systems to share statistical information across related words. These embeddings are typically based on distributional statistics, making it difficult for them to generalize to rare or unseen words.
Bhatia, Parminder +2 more
core +1 more source
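(Not the model of the paper above; a minimal sketch of the related idea that sub-word structure lets rare or unseen words receive informative vectors: compose a word vector from shared character n-gram vectors. The table `ngram_table`, the hashing scheme, and the n-gram range are illustrative assumptions.)

```python
# Minimal sketch: build a vector for a rare or unseen word by summing
# hashed character n-gram vectors, so related surface forms share parameters.
import numpy as np

DIM, BUCKETS = 50, 2**12
rng = np.random.default_rng(0)
ngram_table = rng.normal(scale=0.1, size=(BUCKETS, DIM))  # shared n-gram embeddings

def char_ngrams(word, n_min=3, n_max=5):
    w = f"<{word}>"                               # mark word boundaries
    return [w[i:i + n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

def subword_embedding(word):
    """Sum the vectors of all character n-grams (hashed into a fixed table)."""
    idx = [hash(g) % BUCKETS for g in char_ngrams(word)]
    return ngram_table[idx].sum(axis=0)

# Unseen, morphologically related words still end up with similar vectors.
v1, v2 = subword_embedding("unhappiness"), subword_embedding("happiness")
cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(round(float(cos), 3))
```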
For the first time, a highly sensitive electrochemical biosensor based on SiO2‐based hairy particles with a grafted PDMAEMA polymer brush containing a quantifiable and large amount of immobilized laccase is reported. The fabricated biosensor exhibits a sensitivity of 0.14 A·M⁻¹, a limit of detection (LOD) of 0.1 µM, and a detection range of 0.3–750 µM, ...
Pavel Milkin +7 more
wiley +1 more source
A Nested Chinese Restaurant Topic Model for Short Texts with Document Embeddings
In recent years, short texts have become a prevalent form of text on the internet. Due to the short length of each text, conventional topic models for short texts suffer from sparse word co-occurrence information.
Yue Niu, Hongjie Zhang, Jing Li
doaj +1 more source

