Results 111 to 120 of about 130,387
Bacterial α‐diversity decreases, while stochasticity and community stability increase, across the 15 m‐deep vertical profiles and along the degradation gradient within the active layer. The abundance and the interactions of core taxa mainly control community stability in the active and permafrost layers, respectively.
Shengyun Chen +13 more
wiley +1 more source
Heuristically Adaptive Diffusion‐Model Evolutionary Strategy
Building on the mathematical equivalence between diffusion models and evolutionary algorithms, researchers demonstrate unprecedented control over evolutionary optimization through conditional diffusion. By training diffusion models to associate parameters with specific traits, they can guide evolution toward solutions exhibiting desired behaviors ...
Benedikt Hartl +3 more
wiley +1 more source
MGM as a Large‐Scale Pretrained Foundation Model for Microbiome Analyses in Diverse Contexts
We present the Microbial General Model (MGM), a transformer‐based foundation model pretrained on over 260,000 microbiome samples. MGM learns contextualized microbial representations via self‐supervised language modeling, enabling robust transfer learning, cross‐regional generalization, keystone taxa discovery, and prompt‐guided generation of realistic ...
Haohong Zhang +5 more
wiley +1 more source
ACHO: Adaptive Conformal Hyperparameter Optimization
Several novel frameworks for hyperparameter search have emerged in the last decade, but most rely on strict, often normal, distributional assumptions, limiting search model flexibility. This paper proposes a novel optimization framework based on upper confidence bound sampling of conformal confidence intervals, whose weaker assumption of ...
openaire +2 more sources
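The ACHO abstract names two ingredients, conformal confidence intervals and upper‐confidence‐bound sampling, which can be combined in a toy sketch. This is not the paper's actual procedure; the 1‑nearest‑neighbour surrogate and the synthetic objective are stand‑ins chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x_obs = rng.uniform(0, 1, size=20)                      # hyperparameter values tried so far
y_obs = np.sin(6 * x_obs) + 0.1 * rng.normal(size=20)   # noisy validation scores

# Split: fit the surrogate on one half, calibrate the conformal interval on the other.
x_fit, y_fit = x_obs[:10], y_obs[:10]
x_cal, y_cal = x_obs[10:], y_obs[10:]

def surrogate(x):
    """1-nearest-neighbour surrogate over fitted configs (a stand-in for any model)."""
    idx = np.abs(x[:, None] - x_fit[None, :]).argmin(axis=1)
    return y_fit[idx]

# Split-conformal half-width: finite-sample-corrected quantile of calibration residuals.
# No normality assumption is needed, only exchangeability of the calibration points.
alpha = 0.1
res = np.abs(y_cal - surrogate(x_cal))
n = len(res)
q = np.quantile(res, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))

# UCB acquisition: propose the candidate whose conformal upper bound is largest.
candidates = np.linspace(0, 1, 201)
upper = surrogate(candidates) + q
next_config = candidates[upper.argmax()]
```

The distribution‑free interval is what relaxes the "strict, often normal, distributional assumptions" the snippet criticizes: the surrogate can be any point predictor, and the interval width comes from held‑out residuals rather than a Gaussian posterior.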
Evaluating the Utilities of Foundation Models in Single‐Cell Data Analysis
This study delivers the first systematic, task‐level evaluation of single‐cell foundation models across eight core analytical tasks. By benchmarking 10 leading models with the scEval framework, it reveals where foundation models truly add value and where task‐specific methods still dominate, and it provides concrete, reproducible guidelines to steer the next ...
Tianyu Liu +4 more
wiley +1 more source
Overtuning in Hyperparameter Optimization
Accepted at the Fourth Conference on Automated Machine Learning (Methods Track).
Lennart Schneider +2 more
openaire +2 more sources
Hyperparameters: Optimize, or Integrate Out? [PDF]
I examine two approximate methods for computational implementation of Bayesian hierarchical models, that is, models which include unknown hyperparameters such as regularization constants. In the ‘evidence framework’ the model parameters are integrated over, and the resulting evidence is maximized over the hyperparameters.
openaire +1 more source
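The "evidence framework" named in the abstract (MacKay's term) can be stated in one line: for data $D$, parameters $w$, and hyperparameters $\alpha$ such as regularization constants, the parameters are integrated out and the resulting marginal likelihood (the evidence) is maximized over the hyperparameters:

```latex
p(D \mid \alpha) = \int p(D \mid w)\, p(w \mid \alpha)\, \mathrm{d}w,
\qquad
\hat{\alpha} = \arg\max_{\alpha}\; p(D \mid \alpha).
```

The alternative the title alludes to is to integrate $\alpha$ out as well, rather than optimizing it.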
De Novo Multi‐Mechanism Antimicrobial Peptide Design via Multimodal Deep Learning
Current AI‐driven peptide discovery often overlooks complex structural data. This study presents M3‐CAD, a generative pipeline that leverages 3D voxel coloring and a massive database of over 12 000 peptides to capture nuanced physicochemical contexts.
Xiaojuan Li +23 more
wiley +1 more source
A Hybrid Optimization Model for Transformer Fault Diagnosis Based on Gas Classification
Dissolved gas analysis (DGA) provides valuable information for transformer condition monitoring, yet accurate multi-class fault identification remains challenging due to overlapping gas patterns and the sensitivity of classifier hyperparameters.
Junju Lai +6 more
doaj +1 more source
A two-stage renal disease classification based on transfer learning with hyperparameters optimization. [PDF]
Badawy M +5 more
europepmc +1 more source