Enhancing learning on uncertain pixels in self-distillation for object segmentation
Self-distillation guides a model's learning by transferring knowledge from the model itself, an approach that has shown advantages in object segmentation (a minimal sketch of the idea follows this entry).
Lei Chen +5 more
doaj +1 more source
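As an aside on the mechanism this entry names: below is a minimal PyTorch sketch of uncertainty-weighted self-distillation for segmentation. It is an illustration under stated assumptions, not the cited paper's method; the entropy-based pixel weighting and the function name are hypothetical, and the teacher logits are assumed to come from the model itself (e.g. an EMA copy or an earlier snapshot).

```python
# Hypothetical sketch of uncertainty-weighted self-distillation;
# not the method of the cited paper.
import torch
import torch.nn.functional as F

def uncertainty_weighted_sd_loss(student_logits, teacher_logits, T=2.0):
    """Per-pixel self-distillation, up-weighting uncertain pixels.

    Both logit tensors have shape (N, C, H, W); `teacher_logits` are
    assumed to come from the model itself (EMA copy or earlier snapshot).
    """
    p_t = F.softmax(teacher_logits.detach() / T, dim=1)
    log_p_t = p_t.clamp_min(1e-8).log()
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    # Teacher entropy as a simple per-pixel uncertainty proxy.
    entropy = -(p_t * log_p_t).sum(dim=1)               # (N, H, W)
    weight = entropy / entropy.amax().clamp_min(1e-8)   # scale to [0, 1]
    kl = (p_t * (log_p_t - log_p_s)).sum(dim=1)         # per-pixel KL
    return (weight * kl).mean() * T * T
```

In practice a term like this would be added to the ordinary per-pixel cross-entropy loss rather than used alone.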
Poking Pluripotency: Nanoinjection Into Human iPSCs
Nanoinjection into hiPSCs: silicon nanotubes effectively transfect human induced pluripotent stem cells (hiPSCs) with mRNA, enabled by delayed extracellular matrix application and enhanced surface functionalization. Nanoinjection is demonstrated with several reporter mRNAs, including co‐transfection of mCherry and GFP.
Jann Harberts +5 more
wiley +1 more source
Decoupled Classifier Knowledge Distillation.
Mainstream knowledge distillation methods primarily include self-distillation, offline distillation, online distillation, output-based distillation, and feature-based distillation (the output-based variant is sketched below).
Hairui Wang +3 more
doaj +1 more source
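For orientation only: of the categories this snippet lists, output-based distillation is the easiest to show compactly. Below is a minimal PyTorch sketch of the standard softened-logit loss in the style of Hinton et al.; it is not the decoupled classifier method of the cited paper, and `alpha` and `T` are illustrative hyperparameters.

```python
import torch
import torch.nn.functional as F

def output_based_kd_loss(student_logits, teacher_logits, labels,
                         alpha=0.5, T=2.0):
    """Hard-label cross-entropy blended with softened teacher-student KL."""
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits.detach() / T, dim=-1),
        reduction="batchmean",
    ) * T * T  # T^2 keeps gradient scale comparable across temperatures
    return (1.0 - alpha) * ce + alpha * kl
```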
Self-supervised Knowledge Distillation for Few-shot Learning [PDF]
Jathushan Rajasegaran +4 more
openalex +1 more source
Annealing Self-Distillation Rectification Improves Adversarial Training
In standard adversarial training (sketched below), models are optimized to fit one-hot labels within an allowable adversarial perturbation budget. However, ignoring the underlying distribution shift introduced by the perturbations causes robust overfitting. To ...
Chen, Shang-Tse +2 more
core
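Context for the "perturbation budget" this snippet mentions: a minimal PGD attack loop of the kind standard adversarial training uses, sketched in PyTorch under the usual L-infinity assumptions. This is the baseline the paper improves on, not its annealing rectification scheme; `eps`, `alpha`, and `steps` are illustrative values.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8 / 255, alpha=2 / 255, steps=10):
    """Step-and-project PGD inside an L-infinity ball of radius `eps`."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1)
    for _ in range(steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)  # fit one-hot labels
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        # Project back into the budget, then into the valid image range.
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()
```

Training then minimizes the loss on `model(pgd_attack(model, x, y))` rather than on clean inputs.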
Self‐Regulating Sodium‐Ion Battery Materials: From Phase Reconstruction to Functional Activation
Self‐regulating sodium‐ion batteries hinge on programmable phase behavior in layered oxides, interphases that renew without growth, and electrolytes that steer solvation and chemistry. This review distills mechanisms into design rules that span composition, site and entropy tuning, and self‐buffering anodes, linking operando evidence to choices and ...
Hong Gao +10 more
wiley +1 more source
Improving deep metric learning via self-distillation and online batch diffusion process [PDF]
Zelong Zeng +3 more
openalex +1 more source
DinoSR: Self-Distillation and Online Clustering for Self-supervised Speech Representation Learning
In this paper, we introduce self-distillation and online clustering for self-supervised speech representation learning (DinoSR), which combines masked language modeling, self-distillation, and online clustering. We show that these concepts complement each other ... (the teacher update and clustering step are sketched below).
Auli, Michael +4 more
core
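Two of the three ingredients the abstract names can be sketched compactly. Below is a hedged PyTorch illustration of an EMA-teacher self-distillation update and a nearest-codeword assignment for online clustering; the function names and the momentum value are assumptions, not DinoSR's actual code.

```python
import torch

@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    """Teacher tracks the student as an exponential moving average."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(momentum).add_(p_s, alpha=1.0 - momentum)

def assign_codewords(features, codebook):
    """Online-clustering step: nearest codebook entry per teacher feature.

    features: (N, D), codebook: (K, D). The returned indices serve as
    discrete targets the student predicts at masked positions.
    """
    return torch.cdist(features, codebook).argmin(dim=-1)  # (N,)
```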
Conductive Hydrogels for Exogenous Sensing and Cell Fate Control
We engineer electrically conductive hydrogels by combining sulfated glycosaminoglycans with semiconducting polymers. These hydrogels bind bioactive proteins, including growth factors, whose release or retention can be modulated by low‐voltage stimulation. The hydrogels are also integrated as 3D channels in organic electrochemical transistors as part of ...
Teuku Fawzul Akbar +15 more
wiley +1 more source
Diverse Feature Learning by Self-distillation and Reset
Our paper addresses the problem of models struggling to learn diverse features, whether by forgetting previously learned features or by failing to learn new ones (a generic reset sketch follows this entry).
Park, Sejik
core
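A generic illustration of the "reset" half of the title: periodically re-initializing a submodule so it can learn fresh features. This PyTorch sketch assumes a layer that implements `reset_parameters` (most built-in layers do) and is not the paper's specific schedule; `reset_submodule` is a hypothetical helper.

```python
import torch.nn as nn

def reset_submodule(model: nn.Module, name: str) -> None:
    """Re-initialize one named layer in place, e.g. every few epochs."""
    layer = dict(model.named_modules())[name]
    if hasattr(layer, "reset_parameters"):
        layer.reset_parameters()

# Usage sketch: reset the final classifier of a small MLP.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
reset_submodule(model, "2")  # nn.Sequential names children "0", "1", "2"
```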

