
EGaIn‐Activated Bioinspired Silk Micro/Nanofibril Eutectogels Breaking the Strength–Conductivity Trade‐Off for High‐Performance Wearable Bioelectronics

open access: yes (Advanced Science, EarlyView)
Inspired by the extracellular matrix, eutectogels are prepared by in situ deconstruction of silk fibroin into micro/nanofibrils and EGaIn‐induced polymerization. The multiscale fibril network with dynamic crosslinking achieves high strength (1.25 MPa), toughness (23.09 MJ m⁻³), and conductivity (1.51 S m⁻¹), outperforming previous natural polymer ...
Haiwei Yang   +4 more
wiley   +1 more source

A Subset of Pro‐inflammatory CXCL10+ LILRB2+ Macrophages Derives From Recipient Monocytes and Drives Renal Allograft Rejection

open access: yes (Advanced Science, EarlyView)
This study uncovers a recipient‐derived monocyte‐to‐macrophage trajectory that drives inflammation during kidney transplant rejection. Using over 150 000 single‐cell profiles and more than 850 biopsies, the authors identify CXCL10+ macrophages as key predictors of graft loss.
Alexis Varin   +16 more
wiley   +1 more source

isiZulu Word Embeddings

2021 Conference on Information Communications Technology and Society (ICTAS), 2021
Word embeddings are currently the most popular vector space model in Natural Language Processing. How we encode words is important because it affects the performance of many downstream tasks such as Machine Translation (MT), Information Retrieval (IR) and Automatic Speech Recognition (ASR).
Sibonelo Dlamini   +3 more
openaire   +1 more source

Topical Word Embeddings

Proceedings of the AAAI Conference on Artificial Intelligence, 2015
Most word embedding models typically represent each word using a single vector, which makes these models indiscriminative for ubiquitous homonymy and polysemy. In order to enhance discriminativeness, we employ latent topic models to assign topics for each word in the text corpus, and learn topical word embeddings (TWE) based on both ...
Yang Liu   +3 more
openaire   +1 more source

Word Embeddings

Abstract: This chapter deals with the mathematical representation of words through vectors, or embeddings, which are the basis of modern language models. It starts by discussing the limits of the one-hot representation and continues with a section that presents traditional approaches based on the factorization of the word co-occurrence ...
Christophe Gaillac, Jérémy L'Hour
openaire   +2 more sources
