Results 1 to 10 of about 140,943

Decoupled Classifier Knowledge Distillation [PDF]

open access: yes (PLoS ONE)
Mainstream knowledge distillation methods primarily include self-distillation, offline distillation, online distillation, output-based distillation, and feature-based distillation.
Hairui Wang   +3 more
doaj   +4 more sources

Memory-Replay Knowledge Distillation [PDF]

open access: yes (Sensors, 2021)
Knowledge Distillation (KD), which transfers the knowledge from a teacher to a student network by penalizing their Kullback–Leibler (KL) divergence, is a widely used tool for Deep Neural Network (DNN) compression in intelligent sensor systems ...
Jiyue Wang, Pei Zhang, Yanxiong Li
doaj   +3 more sources
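
The snippet above states the standard formulation: the student is trained to minimize the KL divergence between its outputs and the teacher's softened outputs. A minimal PyTorch sketch of that baseline objective (the classic Hinton-style loss, not the Memory-Replay variant described in this paper; the temperature T and mixing weight alpha are illustrative choices, not values from the paper):

    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        """Baseline KD: KL on temperature-softened logits plus hard-label CE."""
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),  # student log-probs
            F.softmax(teacher_logits / T, dim=1),      # teacher probs
            reduction="batchmean",
        ) * (T * T)  # T^2 rescales gradients to balance the hard-label term
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

In training, the teacher logits would be computed under torch.no_grad() so that only the student receives gradients.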

Discriminator-Enhanced Knowledge-Distillation Networks

open access: yes (Applied Sciences, 2023)
Query auto-completion (QAC) serves as a critical functionality in contemporary textual search systems by generating real-time query completion suggestions based on a user’s input prefix. Despite the prevalent use of language models (LMs) in QAC candidate ...
Zhenping Li   +4 more
doaj   +3 more sources

MTAKD: multi-teacher agreement knowledge distillation for edge AI skin disease diagnosis [PDF]

open access: yes (Scientific Reports)
Skin disease diagnosis remains challenging in remote areas due to limited access to dermatology specialists and unreliable internet connectivity. Edge AI offers a potential solution by offloading the inference process from cloud servers to mobile devices.
Andreas Winata   +4 more
doaj   +2 more sources

A contrast enhanced representation normalization approach to knowledge distillation [PDF]

open access: yes (Scientific Reports)
Within the research scope of knowledge distillation, contrastive representation distillation has achieved remarkable results by introducing the Contrastive Representation Distillation Loss.
Zhiqiang Bao, Di Zhu, Leying Du, Yang Li
doaj   +2 more sources
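
A contrastive distillation loss of the kind named in this abstract is typically an InfoNCE-style objective: the student's representation of a sample is pulled toward the teacher's representation of the same sample and pushed away from the teacher's representations of other samples. A minimal in-batch sketch, assuming L2-normalized features and batch-internal negatives (the paper's exact formulation, including any memory buffer of negatives, is not taken from the snippet):

    import torch
    import torch.nn.functional as F

    def contrastive_distill_loss(f_student, f_teacher, tau=0.07):
        """In-batch contrastive distillation: the (student_i, teacher_i)
        pair is the positive; every other teacher embedding is a negative."""
        zs = F.normalize(f_student, dim=1)  # (B, D) student features
        zt = F.normalize(f_teacher, dim=1)  # (B, D) teacher features (detached upstream)
        logits = zs @ zt.t() / tau          # (B, B) cosine-similarity matrix
        targets = torch.arange(zs.size(0), device=zs.device)
        return F.cross_entropy(logits, targets)  # diagonal = matching pairs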

Stepwise self-knowledge distillation for skin lesion image classification [PDF]

open access: yes (Scientific Reports)
Self-knowledge distillation, which involves using the same network structure for both the teacher and student models, has gained considerable attention in the field of medical image classification.
Jian Zheng   +4 more
doaj   +2 more sources
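
Since self-distillation uses one architecture for both roles, a common generic instantiation freezes a snapshot of the network to serve as its own teacher. A hedged sketch under that assumption (the paper's stepwise schedule is not reproduced here):

    import copy
    import torch
    import torch.nn.functional as F

    def make_self_teacher(model):
        """Freeze a snapshot of the network so it can act as its own teacher."""
        teacher = copy.deepcopy(model).eval()
        for p in teacher.parameters():
            p.requires_grad_(False)
        return teacher

    def self_distill_loss(model, teacher, x, y, T=3.0, alpha=0.5):
        logits = model(x)
        with torch.no_grad():
            targets = F.softmax(teacher(x) / T, dim=1)  # soft targets from snapshot
        kl = F.kl_div(F.log_softmax(logits / T, dim=1), targets,
                      reduction="batchmean") * T * T
        return alpha * kl + (1 - alpha) * F.cross_entropy(logits, y)

Refreshing the snapshot between training stages (teacher = make_self_teacher(model)) gives a stepwise flavor, though the paper's actual schedule may differ.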

Multistage feature fusion knowledge distillation

open access: yes (Scientific Reports)
The recognition performance of lightweight models is generally lower than that of large models. Knowledge distillation, by teaching a student model using a teacher model, can further enhance the recognition accuracy of lightweight models.
Gang Li   +5 more
doaj   +3 more sources
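
Feature-based methods like this one supervise intermediate activations rather than only the output logits. A generic hint-style sketch of that building block (the 1x1 adapter and MSE criterion are common assumptions, not this paper's specific multistage fusion):

    import torch.nn as nn
    import torch.nn.functional as F

    class FeatureDistiller(nn.Module):
        """Hint-style feature matching: a 1x1 conv adapts the student's
        channel width to the teacher's, then an MSE loss aligns the maps."""
        def __init__(self, c_student, c_teacher):
            super().__init__()
            self.adapter = nn.Conv2d(c_student, c_teacher, kernel_size=1)

        def forward(self, f_student, f_teacher):
            f_hat = self.adapter(f_student)  # (B, C_teacher, H, W)
            if f_hat.shape[-2:] != f_teacher.shape[-2:]:
                f_hat = F.interpolate(f_hat, size=f_teacher.shape[-2:])
            return F.mse_loss(f_hat, f_teacher.detach())  # teacher side frozen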

High efficiency classification of thyroid cytopathological images based on knowledge distillation and vision transformer [PDF]

open access: yes (Scientific Reports)
Thyroid cancer is one of the most common types of cancer; pathological diagnosis based on Fine Needle Aspiration Cytology is clinically used as the standard for assessing thyroid cancer.
Jiazhe Zhang   +8 more
doaj   +2 more sources

Mutual Learning Knowledge Distillation Based on Multi-stage Multi-generative Adversarial Network [PDF]

open access: yes (Jisuanji kexue, 2022)
Aiming at the problems of insufficient knowledge distillation efficiency, single-stage training methods, complex training processes, and difficult convergence in traditional knowledge distillation methods for image classification tasks, this paper designs a ...
HUANG Zhong-hao, YANG Xing-yao, YU Jiong, GUO Liang, LI Xiang
doaj   +1 more source
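
In mutual learning, two peer networks train simultaneously, each fitting the labels while mimicking the other's predictions. A minimal deep-mutual-learning-style sketch (the multi-stage multi-generative adversarial components named in the title are not modeled here):

    import torch.nn.functional as F

    def mutual_learning_losses(logits_a, logits_b, labels):
        """Each peer fits the labels and mimics the other's predictions;
        detach() stops either network from being pulled through the other."""
        log_pa = F.log_softmax(logits_a, dim=1)
        log_pb = F.log_softmax(logits_b, dim=1)
        pa, pb = log_pa.exp(), log_pb.exp()
        loss_a = F.cross_entropy(logits_a, labels) \
            + F.kl_div(log_pa, pb.detach(), reduction="batchmean")
        loss_b = F.cross_entropy(logits_b, labels) \
            + F.kl_div(log_pb, pa.detach(), reduction="batchmean")
        return loss_a, loss_b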

Knowledge Distillation: A Survey [PDF]

open access: yes (International Journal of Computer Vision, 2021)
In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its scalability to encode large-scale data and to maneuver billions of model parameters.
Jianping Gou   +3 more
openaire   +3 more sources
