
CELLama: Foundation Model for Single Cell and Spatial Transcriptomics by Cell Embedding Leveraging Language Model Abilities

open access: yes. Advanced Science, EarlyView.
CELLama is a framework that harnesses language models to convert cellular data into "sentences" representing gene expression and metadata, enabling a universal embedding of cells. Unlike most single‐cell foundation models, CELLama supports scalable analysis and offers flexible applications, including spatial transcriptomics.
Jeongbin Park et al.
wiley

MGM as a Large‐Scale Pretrained Foundation Model for Microbiome Analyses in Diverse Contexts

open access: yes. Advanced Science, EarlyView.
We present the Microbial General Model (MGM), a transformer‐based foundation model pretrained on over 260,000 microbiome samples. MGM learns contextualized microbial representations via self‐supervised language modeling, enabling robust transfer learning, cross‐regional generalization, keystone taxa discovery, and prompt‐guided generation of realistic …
Haohong Zhang et al.
wiley

Interbrain coupling during language learning contributes to learning outcomes. [PDF]

open access: yes. Soc Cogn Affect Neurosci.
Shamay-Tsoory SG et al.
europepmc

Learned Conformational Space and Pharmacophore Into Molecular Foundational Model

open access: yes. Advanced Science, EarlyView.
The Ouroboros model introduces two orthogonal modules within a unified framework that independently learn molecular representations and generate chemical structures. This design enables flexible optimization strategies for each module and faithful structure reconstruction without prompts or added noise.
Lin Wang et al.
wiley
