Results 151 to 160 of about 14,848
Enhancing learning on uncertain pixels in self-distillation for object segmentation
Self-distillation guides model learning by transferring knowledge within the model itself, an approach that has shown advantages in object segmentation.
Lei Chen +5 more
doaj +1 more source
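For orientation on the technique this entry describes, below is a minimal sketch, assuming a pixel-wise self-distillation setup with uncertainty weighting; it is not this paper's method, and the function name, temperature `tau`, and entropy-based weighting are illustrative assumptions.

```python
# Minimal sketch (an assumption, not this paper's method) of pixel-wise
# self-distillation with uncertainty weighting for segmentation.
import math
import torch.nn.functional as F

def uncertainty_weighted_sd_loss(student_logits, teacher_logits, tau=2.0):
    """student_logits, teacher_logits: (B, C, H, W) segmentation logits.
    The teacher branch is the model's own (e.g. EMA or deeper) predictions."""
    t_prob = F.softmax(teacher_logits / tau, dim=1)
    # Per-pixel entropy of the teacher distribution, normalized to [0, 1]:
    # uncertain pixels receive more weight in the distillation loss.
    entropy = -(t_prob * t_prob.clamp_min(1e-8).log()).sum(dim=1)
    weight = entropy / math.log(t_prob.size(1))
    # Pixel-wise KL(teacher || student), scaled by tau^2 as in standard KD.
    kl = F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        t_prob,
        reduction="none",
    ).sum(dim=1) * tau ** 2
    return (weight * kl).mean()
```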
DinoSR: Self-Distillation and Online Clustering for Self-supervised Speech Representation Learning
In this paper, we introduce self-distillation and online clustering for self-supervised speech representation learning (DinoSR), which combines masked language modeling, self-distillation, and online clustering. We show that these concepts complement each other ...
Auli, Michael +4 more
core
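As a rough illustration of how self-distillation and online clustering can be combined, here is a sketch under assumptions, not the DinoSR implementation: an EMA-updated teacher produces target features, each is assigned to its nearest centroid in an online codebook, and the student predicts the resulting cluster IDs. The function names, EMA decay, and codebook handling are assumptions.

```python
# Rough sketch of combining an EMA teacher with online clustering
# (assumed names and hyperparameters; not the DinoSR implementation).
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, decay=0.999):
    # Teacher weights are an exponential moving average of the student's.
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(decay).add_(s, alpha=1.0 - decay)

def cluster_prediction_loss(student_logits, teacher_feats, codebook):
    """student_logits: (N, K) scores over K clusters for masked positions;
    teacher_feats: (N, D) teacher features; codebook: (K, D) centroids."""
    # Online clustering step: assign each teacher feature to its nearest
    # centroid, producing discrete targets for the student.
    targets = torch.cdist(teacher_feats, codebook).argmin(dim=1)
    return F.cross_entropy(student_logits, targets)
```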
Nanocrystal Quality–Controlled Scintillation in Porous YAG:Ce Aerogels
High‐temperature thermal treatment of YAG:Ce nanoscintillator aerogels increases light yield under ionizing radiation by up to eightfold without affecting timing performance. Additionally, low‐temperature atmospheric treatments reversibly modify the cerium oxidation state, revealing strong defect sensitivity and offering new strategies to optimize ...
Pavlo Mai +11 more
wiley +1 more source
Decoupled Classifier Knowledge Distillation.
Mainstream knowledge distillation methods primarily include self-distillation, offline distillation, online distillation, output-based distillation, and feature-based distillation.
Hairui Wang +3 more
doaj +1 more source
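For context on the categories this entry lists, the following is the textbook output-based (logit) distillation loss in the style of Hinton et al.; it is generic KD for orientation, not the decoupled-classifier method the paper proposes. The temperature `tau` and mixing weight `alpha` are conventional but arbitrary choices.

```python
# Textbook output-based (logit) distillation loss, Hinton-style; generic
# KD for orientation, not the paper's decoupled-classifier method.
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, tau=4.0, alpha=0.5):
    # Soft term: match the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="batchmean",
    ) * tau ** 2
    # Hard term: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard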
Smooth and Stepwise Self-Distillation for Object Detection
Distilling the structured information captured in feature maps has contributed to improved results for object detection tasks, but requires careful selection of baseline architectures and substantial pre-training.
Aguiar, Derek +4 more
core
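A generic feature-map distillation term of the kind this entry alludes to might look like the sketch below; this is an assumption for illustration, not the paper's smooth/stepwise schedule. A 1x1 convolution projects student features to the teacher's channel count before an L2 penalty.

```python
# Generic feature-map distillation term (an assumption, not the paper's
# smooth/stepwise schedule): match student features to frozen teacher
# features through a learned 1x1 projection.
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistill(nn.Module):
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # Project student channels to the teacher's channel count.
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        # L2 penalty against the detached (non-trainable) teacher map.
        return F.mse_loss(self.proj(student_feat), teacher_feat.detach())
```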
On–Off Switchable Micromotors for Use in Steerable Microvehicles
Electrically controllable micromotors and microvehicles are developed by tuning the diffusion of the fuel. Self‐propelled micromotors using bubble propulsion show great promise for miniaturized devices with multiuse purposes such as cargo delivery and sensing. However, there is currently no method to electrically switch the micromotors on or off. Here, ...
Hugo Severinsson +3 more
wiley +1 more source
Continual Learning for Multimodal Data Fusion of a Soft Gripper
Models trained on a single data modality often struggle to generalize when exposed to a different modality. This work introduces a continual learning algorithm capable of incrementally learning different data modalities by leveraging both class‐incremental and domain‐incremental learning scenarios in an artificial environment where labeled data is ...
Nilay Kushawaha, Egidio Falotico
wiley +1 more source
SkillFactory: Self-Distillation For Learning Cognitive Behaviors
Reasoning models leveraging long chains of thought employ various cognitive skills, such as verification of their answers, backtracking, retrying by an alternate method, and more. Previous work has shown that when a base language model exhibits these skills, the model can learn to leverage them through further training with reinforcement learning (RL).
Sprague, Zayne +5 more
openaire +2 more sources
This work reports on the actuation behavior of CP‐based polypyrrole/PVdF TYs soaked in electrolyte/ionic liquids and of woven actuators into which such TYs are integrated. These studies are important for the creation of on‐body applications such as wearable soft robotics, where textiles offer many advantages such as increased force, integrated electrical ...
Carin Backe +4 more
wiley +1 more source
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation
Recent years have witnessed great success in handling graph-related tasks with Graph Neural Networks (GNNs). Despite the academic success of GNNs, Multi-Layer Perceptrons (MLPs) remain the primary workhorse for practical industrial applications.
Gao, Zhangyang +4 more
core
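The general GNN-to-MLP setting this entry addresses can be sketched as follows; note that the paper's teacher-free dual self-distillation removes the GNN teacher, so this is the conventional teacher-based variant, shown only for orientation.

```python
# Conventional GNN-to-MLP distillation for orientation (the paper's
# teacher-free dual self-distillation removes the GNN teacher).
import torch.nn.functional as F

def gnn_to_mlp_loss(mlp_logits, gnn_logits, labels, tau=2.0, lam=0.5):
    """mlp_logits, gnn_logits: (num_nodes, num_classes)."""
    # The MLP student matches the trained GNN teacher's soft labels ...
    distill = F.kl_div(
        F.log_softmax(mlp_logits / tau, dim=-1),
        F.softmax(gnn_logits.detach() / tau, dim=-1),
        reduction="batchmean",
    ) * tau ** 2
    # ... plus standard supervision on labeled nodes.
    return lam * distill + (1.0 - lam) * F.cross_entropy(mlp_logits, labels)
```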