Results 31 to 40 of about 14,848
CORSD: Class-Oriented Relational Self Distillation
Knowledge distillation is an effective model compression method, but it has some limitations: (1) feature-based distillation methods focus only on distilling the feature map and fail to transfer the relations among data examples; (2) relational distillation methods are either limited to handcrafted functions for relation ...
Yu, Muzhou +5 more
openaire +2 more sources
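The snippet above contrasts feature-based distillation (matching feature maps directly) with relational distillation (matching relations among data examples). A minimal PyTorch sketch of that distinction, assuming generic student/teacher features; the function names and losses here are illustrative, not CORSD's actual method:

```python
# Hedged sketch contrasting feature-map distillation with relational
# distillation. A generic illustration, not the paper's loss; all names ours.
import torch
import torch.nn.functional as F

def feature_distillation_loss(student_feat, teacher_feat):
    """Match student feature maps to teacher feature maps directly."""
    return F.mse_loss(student_feat, teacher_feat)

def relational_distillation_loss(student_feat, teacher_feat):
    """Match pairwise similarities *between examples* instead of raw features."""
    s = F.normalize(student_feat.flatten(1), dim=1)
    t = F.normalize(teacher_feat.flatten(1), dim=1)
    # Gram matrices encode relations among the examples in the batch.
    return F.mse_loss(s @ s.T, t @ t.T)

# Example: a batch of 8 examples with 128-d features.
stu = torch.randn(8, 128)
tea = torch.randn(8, 128)
print(feature_distillation_loss(stu, tea), relational_distillation_loss(stu, tea))
```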
SdAE: Self-distillated Masked Autoencoder
With the development of generative self-supervised learning (SSL) approaches like BeiT and MAE, learning good representations by masking random patches of the input image and reconstructing the missing information has attracted growing attention. However, BeiT and PeCo need a "pre-pretraining" stage to produce discrete codebooks for masked patches ...
Chen, Yabo +6 more
openaire +2 more sources
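The masked-patch pretext task the snippet describes can be sketched in a few lines: hide a random subset of patch tokens, feed only the rest to the encoder, and use the hidden ones as reconstruction targets. The 75% mask ratio and ViT-style shapes below are common MAE conventions, not necessarily SdAE's exact settings:

```python
# Minimal sketch of random patch masking for masked-autoencoder pretraining.
import torch

def random_patch_mask(num_patches: int, mask_ratio: float = 0.75):
    """Return indices of visible and masked patches."""
    noise = torch.rand(num_patches)
    ids = torch.argsort(noise)                    # random permutation of patches
    num_keep = int(num_patches * (1 - mask_ratio))
    return ids[:num_keep], ids[num_keep:]         # visible, masked

patches = torch.randn(196, 768)          # e.g. 14x14 ViT patches, 768-d tokens
visible_ids, masked_ids = random_patch_mask(patches.shape[0])
visible = patches[visible_ids]           # only these are fed to the encoder
target = patches[masked_ids]             # reconstruction targets for the decoder
print(visible.shape, target.shape)       # torch.Size([49, 768]) torch.Size([147, 768])
```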
Production of ethanol from sugars and lignocellulosic biomass by Thermoanaerobacter J1 Isolated from a hot spring in Iceland [PDF]
Thermophilic bacteria have gained increased attention as candidates for bioethanol production from lignocellulosic biomass. This study investigated ethanol production by Thermoanaerobacter strain J1 from hydrolysates made from lignocellulosic biomass in ...
Jessen, Jan Eric, Orlygsson, Johann
core +2 more sources
Self-distilled Vision Transformer for Domain Generalization
23 pages, 12 ...
Sultana, Maryam +4 more
openaire +2 more sources
Extensive research studies have been conducted in recent years to exploit the complementarity among multisensor (or multimodal) remote sensing data for prominent applications such as land cover mapping.
Yawogan Jean Eudes Gbodjo +4 more
doaj +1 more source
Recent advances in second generation ethanol production by thermophilic bacteria [PDF]
There is an increased interest in using thermophilic bacteria for the production of bioethanol from complex lignocellulosic biomass due to their higher operating temperatures and broad substrate range.
Orlygsson, Johann, Scully, Sean
core +2 more sources
LGD: Label-Guided Self-Distillation for Object Detection
In this paper, we propose the first self-distillation framework for general object detection, termed LGD (Label-Guided self-Distillation). Previous studies rely on a strong pretrained teacher to provide instructive knowledge that could be unavailable in real-world scenarios.
Zhang, Peizhen +5 more
openaire +2 more sources
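The snippet says LGD derives instructive knowledge from labels rather than from a pretrained teacher. A hedged toy illustration of that idea (our own construction, not LGD's architecture): a small, jointly trained label-embedding module supplies the targets the student is regressed toward.

```python
# Toy illustration: the "teacher" signal comes from ground-truth labels, not a
# pretrained model. This is not LGD's actual design; all modules here are ours.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelGuidedTeacher(nn.Module):
    """Maps class labels to feature-space targets, learned jointly."""
    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        self.embed = nn.Embedding(num_classes, feat_dim)

    def forward(self, labels):
        return self.embed(labels)

student_feats = torch.randn(4, 256)      # features from the student detector
labels = torch.tensor([1, 3, 3, 7])      # ground-truth classes per instance
teacher = LabelGuidedTeacher(num_classes=80, feat_dim=256)
loss = F.mse_loss(student_feats, teacher(labels))  # distill without a pretrained teacher
```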
Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation
Convolutional neural networks have been widely deployed in various application scenarios. To extend their use to accuracy-critical domains, researchers have investigated approaches to boost accuracy through either ...
Bao, Chenglong +5 more
core +1 more source
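The core idea of this paper, in brief: the network's deepest classifier acts as teacher for auxiliary classifiers attached at shallower depths. A minimal sketch of such an in-network self-distillation loss, with an illustrative temperature; the paper's exact heads and loss weights are not reproduced here:

```python
# Hedged sketch of in-network self-distillation: a shallow auxiliary head
# learns from both the labels and the deepest head's softened predictions.
import torch
import torch.nn.functional as F

def self_distill_loss(shallow_logits, deep_logits, labels, T: float = 3.0):
    """Cross-entropy on labels plus KL toward the deepest classifier."""
    ce = F.cross_entropy(shallow_logits, labels)
    kl = F.kl_div(
        F.log_softmax(shallow_logits / T, dim=1),
        F.softmax(deep_logits.detach() / T, dim=1),  # deepest head as teacher
        reduction="batchmean",
    ) * (T * T)
    return ce + kl

shallow = torch.randn(4, 10)             # logits from a shallow auxiliary head
deep = torch.randn(4, 10)                # logits from the deepest classifier
labels = torch.randint(0, 10, (4,))
print(self_distill_loss(shallow, deep, labels))
```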
Global-Local Self-Distillation for Visual Representation Learning
The downstream accuracy of self-supervised methods is tightly linked to the proxy task solved during training and the quality of the gradients extracted from it. Richer and more meaningful gradient updates are key to allowing self-supervised methods to learn better and more efficiently.
Lebailly, Tim, Tuytelaars, Tinne
openaire +2 more sources
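Going by the title alone, a global-local self-distillation objective can be sketched DINO-style: the student's embedding of a local crop is pulled toward the teacher's embedding of the global view. This is our own generic construction under that assumption, not the paper's actual loss:

```python
# Hedged, DINO-style sketch of a global-local self-distillation objective.
# Temperatures and dimensions are illustrative.
import torch
import torch.nn.functional as F

def global_local_loss(local_student_logits, global_teacher_logits,
                      T_s: float = 0.1, T_t: float = 0.04):
    """Cross-entropy between teacher (global view) and student (local crop)."""
    teacher = F.softmax(global_teacher_logits.detach() / T_t, dim=1)
    log_student = F.log_softmax(local_student_logits / T_s, dim=1)
    return -(teacher * log_student).sum(dim=1).mean()

local = torch.randn(4, 256)    # student output on a small local crop
global_ = torch.randn(4, 256)  # teacher output on the full global view
print(global_local_loss(local, global_))
```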
This study addresses the problem of unsupervised pre-training for video representation learning. The authors focus on two common approaches: knowledge distillation and self-supervised learning.
Wei Zhou +3 more
doaj +1 more source