Results 211 to 220 of about 223,248 (273)
Visual Tracking Using Sparse Coding and Earth Mover's Distance. [PDF]
Yao G, Dani A.
europepmc +1 more source
This article establishes a Taguchi–Bayesian sampling strategy to reconstruct the polymer processing–property landscape at minimal sampling cost, generically building a roadmap for constructing materials databases by sampling their vast design space. This sampling strategy features an alternating lesson between uniformity and representativeness ...
Han Liu, Liantang Li
wiley +1 more source
Prediction of Apoptosis Protein Subcellular Localization with Multilayer Sparse Coding and Oversampling Approach. [PDF]
Chen X, Hu X, Yi W, Zou X, Xue W.
europepmc +1 more source
This study reveals that sampling strategy (i.e., sampling size and approach) is a foundational prerequisite for building accurate and generalizable AI models in peptide discovery. Reaching a threshold of 7.5% of the total tetrapeptide sequence space was essential to ensure reliable predictions.
Meiru Yan +3 more
wiley +1 more source
Toward a unified theory of efficient, predictive, and sparse coding. [PDF]
Chalk M, Marre O, Tkačik G.
europepmc +1 more source
Chat computational fluid dynamics (CFD) introduces a large language model (LLM)‐driven agent that automates OpenFOAM simulations end‐to‐end, attaining 82.1% execution success and 68.12% physical fidelity across 315 benchmarks, far surpassing prior systems.
E Fan +8 more
wiley +1 more source
Sparse coding of pathology slides compared to transfer learning with deep neural networks. [PDF]
Fischer W +4 more
europepmc +1 more source
A machine learning method, opt‐GPRNN, is presented that combines the advantages of neural networks and kernel regressions. It is based on additive GPR in optimized redundant coordinates and allows building a representation of the target with a small number of terms while avoiding overfitting when the number of terms is larger than optimal.
Sergei Manzhos, Manabu Ihara
wiley +1 more source
We propose a residual‐based adversarial‐gradient moving sample (RAMS) method for scientific machine learning that treats samples as trainable variables and updates them to maximize the physics residual, thereby effectively concentrating samples in inadequately learned regions.
Weihang Ouyang +4 more
wiley +1 more source