Results 91 to 100 of about 25,556
PyFuRNAce: an integrated design engine for RNA origami. [PDF]
Monari L +4 more
europepmc +1 more source
Here, a bioinspired light‐accelerated neuromorphic system is reported that seamlessly links tactile sensing, first‐spike‐timing (FST) encoding, and light–electric synaptic learning. Pressure stimuli trigger FST spikes in dual‐gate PDTFTs, while GaOx/IGZO hetero‐synapses exhibit enhanced memory under optical–electrical co‐activation, enabling spiking neural ...
Dan Cai +9 more
wiley +1 more source
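The snippet above cuts off mid-sentence, but the core idea of first‐spike‐timing encoding is simple: a stronger stimulus drives a neuron to threshold sooner, so the stimulus magnitude is carried by the latency of the first spike rather than by a firing rate. As a rough illustration only (a generic leaky integrate-and-fire model, not the authors' PDTFT device; all parameter values are made up):

```python
import numpy as np

def first_spike_time(pressure, tau=10e-3, v_th=1.0, dt=1e-4, t_max=0.1):
    """Latency of the first spike of a leaky integrate-and-fire neuron
    driven by a constant input proportional to `pressure`.
    All parameters are illustrative, not taken from the paper."""
    v = 0.0
    for step in range(int(t_max / dt)):
        # dv/dt = (pressure - v) / tau, integrated with forward Euler
        v += dt * (pressure - v) / tau
        if v >= v_th:
            return step * dt   # stimulus strength is now encoded in this latency
    return None                # sub-threshold input: no spike at all

# Stronger pressure -> earlier first spike (the FST code).
for p in (1.2, 2.0, 5.0):
    print(p, first_spike_time(p))
```

Latency decreases monotonically with input strength, which is what lets downstream synapses learn from spike timing alone.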
Hybrid nanofluid-based targeted drug delivery system for tumor therapy under magnetic and thermal control. [PDF]
Zar PM, Zar VM.
europepmc +1 more source
CDK4/6 inhibition promotes CD8+ T cell expansion through tumor‐macrophage crosstalk by activating HIF‐1α and enhancing MIF‐CD44/CD74 signaling. This reprograms TAMs to boost MHC‐I antigen presentation, and CDK4/6 inhibitor‐trained M1 TAM supernatant therapy synergizes with low‐dose PD‐1 blockade to restore antitumor immunity.
Lin He +17 more
wiley +1 more source
CANA v1.0.0: efficient quantification of canalization in automata networks. [PDF]
Marcus AM, Rozum J, Sizek H, Rocha LM.
europepmc +1 more source
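For readers unfamiliar with the term in the CANA title: a Boolean function is canalizing when some single input, fixed to one value, determines the output regardless of all other inputs. A brute-force check of that textbook definition (a toy sketch, not CANA's own API or its efficient algorithm):

```python
from itertools import product

def is_canalizing(f, k):
    """True if the Boolean function f (taking a k-tuple of 0/1 values)
    has some input i and value a that alone fix the output.
    Simple enumeration for illustration; CANA implements faster methods."""
    for i in range(k):
        for a in (0, 1):
            outputs = {f(x) for x in product((0, 1), repeat=k) if x[i] == a}
            if len(outputs) == 1:   # input i = a canalizes f
                return True
    return False

# AND is canalizing (either input fixed to 0 forces output 0);
# XOR is not (no single input ever determines the output).
print(is_canalizing(lambda x: x[0] & x[1], 2))   # True
print(is_canalizing(lambda x: x[0] ^ x[1], 2))   # False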
Tumor evolution in lung adenocarcinoma is shaped by genetic alterations and spatial immune dynamics. By integrating whole‐exome sequencing, imaging mass cytometry, and spatial transcriptomics across two mouse models, this study reveals how mutational burden, immune infiltration, and cell–state interactions evolve during early and late carcinogenesis ...
Bo Zhu +34 more
wiley +1 more source
PBRM1 ranks as the second most commonly mutated gene in ccRCC. This study reveals that PBRM1 loss promotes an immunosuppressive microenvironment by elevating M2 TAMs via the KDM5C–IL‐6 axis. These M2 TAMs, along with CAFs, form a barrier that excludes CD8+ T cells. Targeting IL‐6 synergizes with anti‐PD1 therapy, offering a promising strategy for PBRM1‐ ...
Wenjiao Xia +14 more
wiley +1 more source
Ten quick tips for developing a reproducible Shiny application. [PDF]
Brun J, Janée G, Curty RG.
europepmc +1 more source
MGM as a Large‐Scale Pretrained Foundation Model for Microbiome Analyses in Diverse Contexts
We present the Microbial General Model (MGM), a transformer‐based foundation model pretrained on over 260,000 microbiome samples. MGM learns contextualized microbial representations via self‐supervised language modeling, enabling robust transfer learning, cross‐regional generalization, keystone taxa discovery, and prompt‐guided generation of realistic ...
Haohong Zhang +5 more
wiley +1 more source
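The MGM abstract describes the standard masked self-supervised recipe applied to microbiome data: treat taxa as tokens, hide a fraction of them, and train a transformer to recover the hidden ones. A minimal sketch of that general setup in PyTorch (vocabulary size, model dimensions, and mask rate are assumptions for illustration, not MGM's actual configuration):

```python
import torch
import torch.nn as nn

# Illustrative only: a tiny masked-token objective over "taxa tokens",
# the generic self-supervised setup the abstract describes.
VOCAB, D, MASK_ID = 5000, 128, 0   # assumed values, not MGM's

class TinyTaxaEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D)
        layer = nn.TransformerEncoderLayer(D, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D, VOCAB)   # predict the original token id

    def forward(self, tokens):
        return self.head(self.encoder(self.embed(tokens)))

model = TinyTaxaEncoder()
tokens = torch.randint(1, VOCAB, (8, 64))     # 8 samples, 64 taxa each
mask = torch.rand(tokens.shape) < 0.15        # hide 15% of positions
corrupted = tokens.masked_fill(mask, MASK_ID)
logits = model(corrupted)
loss = nn.functional.cross_entropy(
    logits[mask], tokens[mask])               # recover the masked taxa
loss.backward()
print(float(loss))
```

After pretraining on this objective, the encoder's contextual representations are what get reused for the transfer-learning and generation tasks the abstract lists.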
ItemComplex: A Python-based visualization framework for ex-post organization and integration of large language-based datasets. [PDF]
Janson K +8 more
europepmc +1 more source

