Results 1 to 10 of about 4,930,132

SILC: Improving Vision Language Pretraining with Self-Distillation [PDF]

open access: green, 2023
Image-text pretraining on web-scale image-caption datasets has become the default recipe for open-vocabulary classification and retrieval models, thanks to the success of CLIP and its variants. Several works have also used CLIP features for dense prediction tasks and have shown the emergence of open-set abilities.
Muhammad Ferjad Naeem   +5 more
core   +5 more sources

Monocular Depth Estimation via Self-Supervised Self-Distillation [PDF]

open access: yes, Sensors
Self-supervised monocular depth estimation can exhibit excellent performance in static environments due to the multi-view consistency assumption during the training process.
Haifeng Hu   +4 more
doaj   +4 more sources

Reverse Self-Distillation: Overcoming the Self-Distillation Barrier

open access: yes, IEEE Open Journal of the Computer Society, 2023
With limited training data, deep neural networks generally cannot extract enough useful information for image classification, resulting in poor performance. Self-distillation, a novel knowledge distillation technique, integrates the roles of teacher and student ...
Shuiping Ni   +4 more
doaj   +2 more sources

A non-negative feedback self-distillation method for salient object detection [PDF]

open access: yes, PeerJ Computer Science, 2023
Self-distillation methods utilize a Kullback-Leibler (KL) divergence loss to transfer knowledge from within the network itself, which can improve model performance without increasing computational resources or complexity. However, when applied to salient ...
Lei Chen   +6 more
doaj   +3 more sources
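The KL-based transfer that these self-distillation methods rely on can be sketched in a few lines. This is a minimal, hedged illustration of the general technique, not the paper's method: the temperature value and the conventional T² scaling below are standard distillation choices assumed for the example.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def self_distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL loss transferring one branch's softened predictions (acting as
    teacher) to another branch of the same network (acting as student).
    The T**2 factor is the conventional scaling that keeps gradient
    magnitudes comparable across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return temperature ** 2 * kl_divergence(p, q)

# Identical branch outputs incur zero loss; diverging outputs are penalized.
print(self_distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # 0.0
print(self_distillation_loss([2.0, 0.5, -1.0], [0.1, 1.5, 0.2]))   # > 0
```

In a real training loop both sets of logits come from forward passes through the same network (e.g. a deep exit teaching a shallow exit), and this term is added to the ordinary cross-entropy loss.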

Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation [PDF]

open access: yes, IEEE International Conference on Computer Vision, 2019
Convolutional neural networks have been widely deployed in various application scenarios. To extend their applicability to accuracy-critical domains, researchers have been investigating approaches to boost accuracy through either ...
Linfeng Zhang   +5 more
semanticscholar   +3 more sources

Improving Differentiable Architecture Search Via Self-Distillation

open access: yes, Neural Networks, 2023
Accepted by Neural ...
Xunyu Zhu   +3 more
openaire   +4 more sources

Relation-based self-distillation method for 2D object detection [PDF]

open access: yes, Scientific Reports
The challenge of enhancing the detection accuracy of widely adopted and stable object detectors while maintaining cost-effectiveness has long been a topic of significant interest and concern within the industry.
Bei Wang   +4 more
doaj   +2 more sources

An improved ShuffleNetV2 method based on ensemble self-distillation for tomato leaf diseases recognition [PDF]

open access: yes, Frontiers in Plant Science
Timely and accurate recognition of tomato diseases is crucial for improving tomato yield. While large deep learning models can achieve high-precision disease recognition, these models often have a large number of parameters, making them ...
Shuiping Ni   +7 more
doaj   +2 more sources

DualDistill: a dual-guided self-distillation approach for carotid plaque analysis [PDF]

open access: yes, Frontiers in Medicine
Accurate classification of carotid plaques is critical to assessing the risk of cardiovascular disease. However, this task remains challenging due to several factors: temporal discontinuity caused by probe motion, the small size of plaques combined with ...
Xiaoman Zhang   +3 more
doaj   +2 more sources

Hierarchical Self-Distillation with Attention for Class-Imbalanced Acoustic Event Classification in Elevators [PDF]

open access: yes, Sensors
Acoustic-based anomaly detection in elevators is crucial for predictive maintenance and operational safety, yet it faces significant challenges in real-world settings, including pervasive multi-source acoustic interference within confined spaces and ...
Shengying Yang   +5 more
doaj   +2 more sources
