This work demonstrates the successful integration of a phenanthroline‐based 2D COF with Mn(I) catalytic sites into a catholyte‐free membrane‐electrode‐assembly cell for CO2 electroreduction. The crystalline COF actively suppresses Mn⁰–Mn⁰ dimerization, achieving a turnover frequency of 617 h⁻¹ at 2.8 V (full‐cell potential) and enabling stable operation.
Laura Spies +8 more
wiley +1 more source
Spoken-Word Recognition: The Access to Embedded Words
Two cross-modal priming experiments investigated whether the representation of either an initial- or a final-embedded word may be activated when the longer carrier word is auditorily presented. Visual targets were semantically related either to the embedded word or to the carrier word or they were unrelated to the primes. A priming effect was found for
Isel, F., Bacri, N.
openaire +3 more sources
Activation of embedded words in spoken word recognition. [PDF]
Three cross-modal associative priming experiments investigated whether speech input activates words that are embedded in other words.
Vroomen, Jean, De Gelder, Béatrice
openaire +2 more sources
Controlled syntheses of lanthanide coordination polymers based on the dihydroxybenzoquinone (DHBQ) organic linker afforded large single crystals of Ln‐DHBQ CPs (Ln = Yb, Nd). A novel structural variant of Yb‐DHBQ is identified by means of single crystal diffraction analysis.
Marina I. Schönherr +7 more
wiley +1 more source
A triple joint extraction method combining hybrid embedding and relational label embedding
The purpose of triple extraction is to obtain relationships between entities from unstructured text and apply them to downstream tasks. The embedding mechanism has a great impact on the performance of the triple extraction model, and the embedding vector ...
Jianfeng DAI +3 more
doaj +2 more sources
Sentiment-Aware Word Embedding for Emotion Classification
Word embeddings are effective intermediate representations for capturing semantic regularities between words in natural language processing (NLP) tasks. We propose a sentiment-aware word embedding for emotion classification, which consists of integrating
Xingliang Mao +4 more
doaj +1 more source
WordRank: Learning Word Embeddings via Robust Ranking
Embedding words in a vector space has gained a lot of attention in recent years. While state-of-the-art methods provide efficient computation of word similarities via a low-dimensional matrix embedding, their motivation is often left unclear.
Ji, Shihao +4 more
core +1 more source
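The low-dimensional word similarities these embedding papers compute reduce to vector comparisons such as cosine similarity. A minimal sketch (the 3-dimensional vectors and their values are invented for illustration, not taken from any of the cited models):

```python
import math

# Hypothetical low-dimensional word vectors for illustration only.
vectors = {
    "king":  [0.90, 0.20, 0.10],
    "queen": [0.85, 0.25, 0.15],
    "apple": [0.10, 0.80, 0.30],
}

def cosine(u, v):
    """Cosine similarity: dot product divided by the vector norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_royal = cosine(vectors["king"], vectors["queen"])  # semantically close pair
sim_fruit = cosine(vectors["king"], vectors["apple"])  # unrelated pair
```

With well-trained embeddings, related words score higher than unrelated ones, which is the property ranking-based methods like WordRank optimize directly.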
For the first time, a highly sensitive electrochemical biosensor based on SiO2‐based hairy particles with a grafted PDMAEMA polymer brush containing a quantifiable and large amount of immobilized Laccase is reported. The fabricated biosensor exhibits a sensitivity of 0.14 A·m⁻¹, a limit of detection (LOD) of 0.1 µM, and a detection range of 0.3–750 µM.
Pavel Milkin +7 more
wiley +1 more source
Refining electronic medical records representation in manifold subspace
Background Electronic medical records (EMR) contain detailed information about patient health. Developing an effective representation model is of great significance for the downstream applications of EMR.
Bolin Wang +5 more
doaj +1 more source
Word Mover’s Embedding: From Word2Vec to Document Embedding [PDF]
While the celebrated Word2Vec technique yields semantically rich representations for individual words, there has been relatively less success in extending it to generate unsupervised sentence or document embeddings. Recent work has demonstrated that a distance measure between documents called \emph{Word Mover's Distance} (WMD) that aligns semantically ...
Wu, Lingfei +7 more
openaire +2 more sources
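The Word Mover's Distance mentioned above is an optimal-transport cost between two documents' word vectors. A common cheap lower bound (the "relaxed" WMD) sends each word to its nearest neighbor in the other document; a minimal sketch with made-up 2-dimensional embeddings, not the paper's method or data:

```python
import math

def euclid(u, v):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def relaxed_wmd(doc_a, doc_b, emb):
    """Relaxed WMD: each word in doc_a moves to its nearest word in doc_b;
    the average travel cost lower-bounds the full optimal-transport WMD."""
    costs = [min(euclid(emb[w], emb[x]) for x in doc_b) for w in doc_a]
    return sum(costs) / len(costs)

# Hypothetical 2-d embeddings for illustration only.
emb = {
    "obama": [1.00, 0.10], "president": [0.90, 0.20],
    "speaks": [0.20, 0.90], "greets":    [0.25, 0.85],
    "press":  [0.50, 0.50], "media":     [0.55, 0.45],
    "band":   [0.00, 1.00],
}

d_close = relaxed_wmd(["obama", "speaks", "press"],
                      ["president", "greets", "media"], emb)
d_far = relaxed_wmd(["obama", "speaks", "press"],
                    ["band", "band", "band"], emb)
```

Documents whose words align pairwise in embedding space (`d_close`) score a smaller distance than unrelated ones (`d_far`), which is the signal WMD-style document embeddings exploit.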

