Results 11 to 20 of about 142,324
Feature fusion-based collaborative learning for knowledge distillation
Deep neural networks have achieved great success in a variety of applications, such as self-driving cars and intelligent robotics. Meanwhile, knowledge distillation has received increasing attention as an effective model compression technique for ...
Yiting Li +4 more
doaj +1 more source
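For orientation, the vanilla distillation objective that methods like the one above build on fits in a few lines. This is a generic sketch, assuming PyTorch; it is not the paper's feature-fusion scheme, and the temperature and weighting values are arbitrary choices.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a temperature-softened KL term with the usual hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),  # frozen teacher targets
        reduction="batchmean",
    ) * (T * T)  # T^2 rescales soft-target gradients to match the hard loss
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```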
Knowledge distillation in deep learning and its applications [PDF]
Deep-learning-based models are relatively large, making them hard to deploy on resource-limited devices such as mobile phones and embedded systems.
Abdolmaged Alkhulaifi +2 more
doaj +2 more sources
Reverse Self-Distillation: Overcoming the Self-Distillation Barrier
With limited training data, deep neural networks generally cannot gather enough helpful information for image classification, resulting in poor performance. Self-distillation, as a novel knowledge distillation technique, integrates the roles of teacher and student ...
Shuiping Ni +4 more
doaj +1 more source
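The snippet does not describe the architecture, but conventional self-distillation (which the "reverse" variant presumably inverts) often lets a network's deepest classifier teach its own shallower auxiliary exits. A minimal sketch of that conventional pattern, assuming PyTorch; the exit structure is a hypothetical illustration, not this paper's design.

```python
import torch.nn.functional as F

def self_distill_loss(exit_logits, final_logits, labels, T=3.0):
    """The deepest exit acts as teacher for each shallower auxiliary exit."""
    teacher = F.softmax(final_logits.detach() / T, dim=1)  # no gradient into the teacher
    loss = F.cross_entropy(final_logits, labels)
    for logits in exit_logits:  # shallower exits play the student role
        loss = loss + F.cross_entropy(logits, labels)
        loss = loss + F.kl_div(
            F.log_softmax(logits / T, dim=1), teacher, reduction="batchmean"
        ) * (T * T)
    return loss
```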
Multi-assistant Dynamic Setting Method for Knowledge Distillation [PDF]
Knowledge distillation is increasingly gaining attention in key areas such as model compression for object recognition. Through in-depth research into the efficiency of knowledge distillation and an analysis of the characteristics of knowledge transfer ...
Si Yuehang, Cheng Qing, Huang Jincai
doaj +1 more source
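The dynamic setting method itself is not described in the snippet; as background, a multi-assistant pipeline in the spirit of teacher-assistant distillation passes knowledge down a chain of progressively smaller models. A hedged sketch, assuming PyTorch and a fixed (non-dynamic) chain; the model list, loss, and optimizer are all assumptions.

```python
import torch
import torch.nn.functional as F

def kd_kl(student_logits, teacher_logits, T=4.0):
    """Temperature-softened KL distillation term."""
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)

def train_chain(models, loader, epochs=1, lr=1e-3):
    """models[0] is the pretrained teacher; every later model distills
    from its immediate predecessor (teacher -> assistants -> student)."""
    for prev, curr in zip(models, models[1:]):
        prev.eval()
        opt = torch.optim.Adam(curr.parameters(), lr=lr)
        for _ in range(epochs):
            for x, y in loader:
                with torch.no_grad():
                    t_logits = prev(x)  # frozen predecessor
                s_logits = curr(x)
                loss = F.cross_entropy(s_logits, y) + kd_kl(s_logits, t_logits)
                opt.zero_grad()
                loss.backward()
                opt.step()
    return models[-1]  # the final, smallest student
```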
A novel model compression method based on joint distillation for deepfake video detection
In recent years, deepfake videos have been abused to create fake news, threatening the integrity of digital videos. Although existing detection methods leverage cumbersome neural networks to achieve promising detection performance, they cannot be ...
Xiong Xu +5 more
doaj +1 more source
Forest Fire Object Detection Analysis Based on Knowledge Distillation
This paper investigates the application of the YOLOv7 object detection model combined with knowledge distillation techniques in forest fire detection.
Jinzhou Xie, Hongmin Zhao
doaj +1 more source
Sub-Band Knowledge Distillation Framework for Speech Enhancement
In single-channel speech enhancement, methods based on full-band spectral features have been widely studied. However, only a few methods pay attention to non-full-band spectral features.
Guanglai Gao +5 more
core +1 more source
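The framework's details are not in the snippet, but the core sub-band idea, slicing the spectrum into frequency bands and supervising each band separately rather than averaging errors over the full band, can be sketched as follows. The band count and the per-band loss are assumptions, not the paper's configuration.

```python
import torch
import torch.nn.functional as F

def sub_band_distill_loss(student_spec, teacher_spec, num_bands=4):
    """Split (batch, freq, time) magnitude spectrograms into frequency
    sub-bands and distill each band separately, so errors in one band
    are not washed out by averaging over the full spectrum."""
    s_bands = torch.chunk(student_spec, num_bands, dim=1)  # split along freq axis
    t_bands = torch.chunk(teacher_spec, num_bands, dim=1)
    losses = [F.mse_loss(s, t.detach()) for s, t in zip(s_bands, t_bands)]
    return sum(losses) / len(losses)
```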
Discriminator-Enhanced Knowledge-Distillation Networks
Query auto-completion (QAC) serves as a critical functionality in contemporary textual search systems by generating real-time query completion suggestions based on a user’s input prefix. Despite the prevalent use of language models (LMs) in QAC candidate ...
Zhenping Li +4 more
doaj +1 more source
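The snippet cuts off before the method, so the following is only a guess at the general "discriminator-enhanced" pattern rather than this paper's design: train a discriminator to distinguish teacher outputs from student outputs, and reward the student for fooling it alongside the usual distillation loss. The logit-level discriminator and all sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical discriminator over logit vectors; the input size of 128
# is an assumption, not a value from the paper.
disc = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

def discriminator_losses(student_logits, teacher_logits):
    """The discriminator learns to label teacher outputs real and student
    outputs fake; the student earns an adversarial reward for being
    indistinguishable, on top of any ordinary distillation loss."""
    real = disc(teacher_logits.detach())
    fake = disc(student_logits.detach())  # detached: trains the discriminator only
    d_loss = (F.binary_cross_entropy_with_logits(real, torch.ones_like(real))
              + F.binary_cross_entropy_with_logits(fake, torch.zeros_like(fake)))
    adv = disc(student_logits)  # gradient flows back into the student here
    s_loss = F.binary_cross_entropy_with_logits(adv, torch.ones_like(adv))
    return d_loss, s_loss
```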
A non-negative feedback self-distillation method for salient object detection [PDF]
Self-distillation methods utilize Kullback-Leibler divergence (KL) loss to transfer the knowledge from the network itself, which can improve the model performance without increasing computational resources and complexity. However, when applied to salient ...
Lei Chen +6 more
doaj +2 more sources
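The KL transfer loss named in the snippet's first sentence, applied per pixel as is typical for dense-prediction tasks such as salient object detection, looks roughly like this in its plain form, before whatever non-negative feedback modification the paper introduces. Shapes and temperature are assumptions.

```python
import torch.nn.functional as F

def pixelwise_kl_loss(student_map, teacher_map, T=1.0):
    """Plain per-pixel KL between teacher and student prediction logits
    of shape (batch, classes, H, W); the softmax runs over the class
    axis at every spatial location."""
    s = F.log_softmax(student_map / T, dim=1)
    t = F.softmax(teacher_map.detach() / T, dim=1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)
```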
Feature Fusion for Online Mutual Knowledge Distillation
We propose a learning framework named Feature Fusion Learning (FFL) that efficiently trains a powerful classifier through a fusion module which combines the feature maps generated from parallel neural networks. Specifically, we train a number of parallel ...
Inseop Chung +3 more
core +1 more source
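One simple realization of the fusion module described above: concatenate the parallel networks' feature maps along the channel dimension, mix them with a 1x1 convolution, and classify the globally pooled result. Channel sizes and the pooling choice are assumptions, not FFL's exact configuration.

```python
import torch
import torch.nn as nn

class FusionModule(nn.Module):
    """Concatenate feature maps from parallel sub-networks along the
    channel axis, mix them with a 1x1 convolution, and classify the
    globally pooled result."""
    def __init__(self, in_channels=(256, 256), out_channels=256, num_classes=100):
        super().__init__()
        self.mix = nn.Conv2d(sum(in_channels), out_channels, kernel_size=1)
        self.head = nn.Linear(out_channels, num_classes)

    def forward(self, feature_maps):
        fused = self.mix(torch.cat(feature_maps, dim=1))  # (B, C, H, W)
        pooled = fused.mean(dim=(2, 3))                   # global average pooling
        return self.head(pooled)                          # fused classifier logits
```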

