Results 1 to 10 of about 14,848

Monocular Depth Estimation via Self-Supervised Self-Distillation [PDF]

open access: yes, Sensors
Self-supervised monocular depth estimation can achieve excellent performance in static environments, owing to the multi-view consistency assumption used during training.
Haifeng Hu   +4 more
doaj   +4 more sources

A non-negative feedback self-distillation method for salient object detection [PDF]

open access: yes, PeerJ Computer Science, 2023
Self-distillation methods utilize the Kullback-Leibler divergence (KL) loss to transfer knowledge from the network itself, which can improve model performance without increasing computational resources or complexity. However, when applied to salient ...
Lei Chen   +6 more
doaj   +3 more sources
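The snippet above mentions KL-divergence-based self-distillation, in which a deeper branch of a network teaches a shallower branch of the same network. As context only, here is a minimal pure-Python sketch of such a loss; the temperature value, logit shapes, and function names are illustrative assumptions, not the paper's actual method:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q): how far the student distribution q is from the teacher p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def self_distillation_loss(deep_logits, shallow_logits, temperature=2.0):
    """KL term aligning a shallow branch's softened output with a deep branch's.

    The temperature**2 factor keeps gradient magnitudes comparable across
    temperatures, as in standard knowledge distillation.
    """
    teacher = softmax(deep_logits, temperature)
    student = softmax(shallow_logits, temperature)
    return temperature ** 2 * kl_divergence(teacher, student)

# Identical branch outputs give zero loss; diverging outputs give a positive loss.
print(self_distillation_loss([2.0, 0.5, 0.1], [2.0, 0.5, 0.1]))       # 0.0
print(self_distillation_loss([2.0, 0.5, 0.1], [0.1, 0.5, 2.0]) > 0)   # True
```

In practice the teacher distribution is detached from the gradient graph so that only the student branch is updated by this term.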

Relation-based self-distillation method for 2D object detection [PDF]

open access: yes, Scientific Reports
The challenge of enhancing the detection accuracy of widely adopted and stable object detectors while maintaining cost-effectiveness has long been a topic of significant interest and concern within the industry.
Bei Wang   +4 more
doaj   +2 more sources

An improved ShuffleNetV2 method based on ensemble self-distillation for tomato leaf diseases recognition [PDF]

open access: yes, Frontiers in Plant Science
Introduction: Timely and accurate recognition of tomato diseases is crucial for improving tomato yield. While large deep learning models can achieve high-precision disease recognition, these models often have a large number of parameters, making them ...
Shuiping Ni   +7 more
doaj   +2 more sources

Robust federated learning for UAV object detection: a joint self-distillation and drift compensation approach [PDF]

open access: yes, Frontiers in Neurorobotics
The rapid advancement of unmanned aerial vehicles (UAVs) in disaster response and environmental monitoring has underscored the growing importance of real-time object detection within UAV swarm networks.
Yu Hangsun   +4 more
doaj   +2 more sources

Hierarchical Self-Distillation with Attention for Class-Imbalanced Acoustic Event Classification in Elevators [PDF]

open access: yes, Sensors
Acoustic-based anomaly detection in elevators is crucial for predictive maintenance and operational safety, yet it faces significant challenges in real-world settings, including pervasive multi-source acoustic interference within confined spaces and ...
Shengying Yang   +5 more
doaj   +2 more sources

DualDistill: a dual-guided self-distillation approach for carotid plaque analysis [PDF]

open access: yes, Frontiers in Medicine
Accurate classification of carotid plaques is critical to assessing the risk of cardiovascular disease. However, this task remains challenging due to several factors: temporal discontinuity caused by probe motion, the small size of plaques combined with ...
Xiaoman Zhang   +3 more
doaj   +2 more sources

InfoMSD: an information-maximization self-distillation framework for parameter-efficient fine-tuning on artwork images [PDF]

open access: yes, Frontiers in Artificial Intelligence
In recent years, despite the remarkable performance of large-scale vision language models across various visual classification tasks, their substantial parameter counts and high fine-tuning costs have hindered deployment in resource-constrained cultural ...
Feng Guan   +3 more
doaj   +2 more sources

SAF-SD: Self-Distillation Object Segmentation Method Based on Sequential Three-Way Mask and Attention Fusion [PDF]

open access: yes, Sensors
Transformer models have achieved powerful performance in various computer vision tasks. However, their black-box nature severely limits model interpretability and the reliability of real-world applications.
Biao Wang   +3 more
doaj   +2 more sources

Efficient Semantic Segmentation via Self-Attention and Self-Distillation [PDF]

open access: yes, IEEE Transactions on Intelligent Transportation Systems, 2022
Lightweight models are pivotal to efficient semantic segmentation, but they often suffer from insufficient contextual information due to their limited convolutions and small receptive fields.
Shumin An, Jing-Hao Xue
exaly   +2 more sources
