Results 21 to 30 of about 141,062 (268)
What Knowledge Gets Distilled in Knowledge Distillation?
Knowledge distillation aims to transfer useful information from a teacher network to a student network, with the primary goal of improving the student's performance on the task at hand. Over the years, there has been a deluge of novel techniques and use cases of knowledge distillation.
Ojha, Utkarsh et al.
Knowledge distillation in deep learning and its applications [PDF]
Deep-learning-based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices.
Abdolmaged Alkhulaifi et al.
Reverse Self-Distillation: Overcoming the Self-Distillation Barrier
With limited data, deep neural networks generally cannot gather enough helpful information in image classification, resulting in poor performance. Self-distillation, as a novel knowledge distillation technique, integrates the roles of teacher and student ...
Shuiping Ni et al.
SMKD: Selective Mutual Knowledge Distillation
Mutual knowledge distillation (MKD) is a technique used to transfer knowledge between multiple models in a collaborative manner. However, not all knowledge is accurate or reliable, particularly under challenging conditions such as label noise, which can lead to models that memorize undesired information. This problem can be ...
Li, Z. et al.
Multi-assistant Dynamic Setting Method for Knowledge Distillation [PDF]
Knowledge distillation is increasingly gaining attention in key areas such as model compression for object recognition. Through in-depth research into the efficiency of knowledge distillation and an analysis of the characteristics of knowledge transfer ...
SI Yuehang, CHENG Qing, HUANG Jincai
A novel model compression method based on joint distillation for deepfake video detection
In recent years, deepfake videos have been abused to create fake news, which threatens the integrity of digital videos. Although existing detection methods leverage cumbersome neural networks to achieve promising detection performance, they cannot be ...
Xiong Xu et al.
Distilling Knowledge by Mimicking Features [PDF]
To appear in IEEE Trans ...
Wang, Guo-Hua, Ge, Yifan, Wu, Jianxin
Sub-Band Knowledge Distillation Framework for Speech Enhancement
In single-channel speech enhancement, methods based on full-band spectral features have been widely studied. However, only a few methods pay attention to non-full-band spectral features.
Gao, Guanglai et al.
Forest Fire Object Detection Analysis Based on Knowledge Distillation
This paper investigates the application of the YOLOv7 object detection model combined with knowledge distillation techniques in forest fire detection.
Jinzhou Xie, Hongmin Zhao
A non-negative feedback self-distillation method for salient object detection [PDF]
Self-distillation methods utilize a Kullback-Leibler divergence (KL) loss to transfer knowledge from the network itself, which can improve model performance without increasing computational resources and complexity. However, when applied to salient ...
Lei Chen et al.
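Several of the results above (the first entry, SMKD, and this self-distillation paper) rest on the same core mechanism: a temperature-softened KL-divergence loss between teacher and student outputs. As a general illustration only, a minimal plain-Python sketch of that loss (the function names, the temperature value, and the example logits are illustrative, not taken from any of the listed papers):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) between softened distributions,
    scaled by T^2 as is conventional in knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
same = kl_distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
diff = kl_distillation_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

In practice this term is combined with the student's ordinary cross-entropy loss on the ground-truth labels; the temperature controls how much of the teacher's "dark knowledge" (the relative probabilities of the wrong classes) is exposed to the student.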

