Results 211 to 220 of about 8,360,833 (340)
CELLama is a framework that harnesses language models to convert cellular data into “sentences” representing gene expression and metadata, enabling a universal embedding of cells. Unlike most single-cell foundation models, CELLama supports scalable analysis and offers flexible applications, including spatial transcriptomics.
Jeongbin Park +7 more
wiley +1 more source
Validating and prioritizing key indicators for blended MOOC implementation in English language learning using the fuzzy Delphi method. [PDF]
Pham AT, Abdullah MRTL.
europepmc +1 more source
Advanced French: Interactive Video Language Learning with "Au Coeur De la Loi"
Susan Carpenter Binkley
openalex +1 more source
MGM as a Large‐Scale Pretrained Foundation Model for Microbiome Analyses in Diverse Contexts
We present the Microbial General Model (MGM), a transformer-based foundation model pretrained on over 260,000 microbiome samples. MGM learns contextualized microbial representations via self-supervised language modeling, enabling robust transfer learning, cross-regional generalization, keystone taxa discovery, and prompt-guided generation of realistic samples …
Haohong Zhang +5 more
wiley +1 more source
Interbrain coupling during language learning contributes to learning outcomes. [PDF]
Shamay-Tsoory SG +3 more
europepmc +1 more source
Learned Conformational Space and Pharmacophore Into Molecular Foundational Model
The Ouroboros model introduces two orthogonal modules within a unified framework that independently learn molecular representations and generate chemical structures. This design enables flexible optimization strategies for each module and faithful structure reconstruction without prompts or noise.
Lin Wang +8 more
wiley +1 more source
Emotion regulation and perceptions of academic stress as key predictors of academic motivation in second language learning. [PDF]
Liang H, Mao X.
europepmc +1 more source