
ResKD: Residual-Guided Knowledge Distillation [PDF]

open access: yes, IEEE Transactions on Image Processing, 2021
Knowledge distillation, aimed at transferring knowledge from a heavy teacher network to a lightweight student network, has emerged as a promising technique for compressing neural networks. However, due to the capacity gap between the heavy teacher and the lightweight student, a significant performance gap still exists between them.
Xuewei Li   +4 more
openaire   +3 more sources
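
For context, ResKD builds on the classical soft-target distillation objective. The snippet below is a minimal PyTorch sketch of that baseline loss (Hinton-style KD, not the residual-guided method itself); the function name, temperature, and weighting are illustrative.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classical soft-target distillation loss (Hinton et al., 2015).

    Combines a KL term between temperature-softened teacher and student
    distributions with the usual cross-entropy on the hard labels.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradients comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```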

Feature Fusion for Online Mutual Knowledge Distillation

open access: yes, 2020
We propose a learning framework named Feature Fusion Learning (FFL) that efficiently trains a powerful classifier through a fusion module which combines the feature maps generated from parallel neural networks. Specifically, we train a number of parallel ...
Chung, Inseop   +3 more
core   +1 more source
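
The FFL abstract above names two ingredients: a fusion module over parallel branches and online mutual distillation. Below is a toy PyTorch sketch of both pieces; the two-branch concatenation fusion and the symmetric-KL mutual loss are simplifying assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionClassifier(nn.Module):
    """Toy fusion module: concatenates the pooled features of two
    parallel branches and classifies from the fused representation
    (names and shapes are illustrative, not the paper's architecture)."""

    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.head = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, f1, f2):
        return self.head(torch.cat([f1, f2], dim=1))

def mutual_kd(logits_a, logits_b, T=3.0):
    """Symmetric KL so each classifier learns from the other's softened output."""
    pa = F.log_softmax(logits_a / T, dim=1)
    pb = F.log_softmax(logits_b / T, dim=1)
    kl_ab = F.kl_div(pa, pb.exp(), reduction="batchmean")
    kl_ba = F.kl_div(pb, pa.exp(), reduction="batchmean")
    return (kl_ab + kl_ba) * (T * T) / 2
```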

Meta Knowledge Distillation

open access: yes, 2022
Recent studies have pointed out that knowledge distillation (KD) suffers from two degradation problems, the teacher-student gap and incompatibility with strong data augmentations, making it inapplicable to training state-of-the-art models, which are trained with advanced augmentations.
Liu, Jihao   +3 more
openaire   +2 more sources

Knowledge Distillation for Quality Estimation [PDF]

open access: yes, Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021
Quality Estimation (QE) is the task of automatically predicting Machine Translation quality in the absence of reference translations, making it applicable in real-time settings, such as translating online social media conversations. Recent success in QE stems from the use of multilingual pre-trained representations, where very large models lead to ...
Gajbhiye, Amit   +6 more
openaire   +3 more sources
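
Since QE predicts a scalar quality score rather than a class distribution, distilling a large QE model typically reduces to regressing the teacher's scores. The sketch below is a hypothetical illustration of that setup, not the authors' pipeline; all names and the MSE-based weighting are assumptions.

```python
import torch.nn.functional as F

def qe_distillation_loss(student_scores, teacher_scores, gold_scores, alpha=0.5):
    """Hypothetical distillation loss for regression-style Quality Estimation:
    the small model matches the large model's predicted quality scores
    while still fitting the gold annotations where available."""
    to_teacher = F.mse_loss(student_scores, teacher_scores.detach())
    to_gold = F.mse_loss(student_scores, gold_scores)
    return alpha * to_teacher + (1 - alpha) * to_gold
```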

Biocuration: Distilling data into knowledge [PDF]

open access: yesPLOS Biology, 2018
Data, including information generated from them by processing and analysis, are an asset with measurable value. The assets that biological research funding produces are the data generated, the information derived from these data, and, ultimately, the discoveries and knowledge these lead to.
openaire   +6 more sources

A Deep Hierarchical Approach to Lifelong Learning in Minecraft

open access: yes, 2016
We propose a lifelong learning system that can reuse and transfer knowledge from one task to another while efficiently retaining the previously learned knowledge base.
Givony, Shahar   +4 more
core   +1 more source

Faithful Knowledge Distillation

open access: yes, 2023
Knowledge distillation (KD) has received much attention due to its success in compressing networks to allow for their deployment in resource-constrained systems. While the problem of adversarial robustness has been studied before in the KD setting, previous works overlook what we term the relative calibration of the student network with respect to its ...
Lamb, Tom A.   +5 more
openaire   +2 more sources

Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation

open access: yes, 2019
Convolutional neural networks have been widely deployed in various application scenarios. To extend these applications to accuracy-critical domains, researchers have been investigating approaches to boost accuracy through either ...
Bao, Chenglong   +5 more
core   +1 more source
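
The "Be Your Own Teacher" entry describes self-distillation, where a network's deepest classifier supervises its own shallower auxiliary exits. A minimal PyTorch sketch of that loss shape follows; the exit structure, single temperature, and uniform weighting are illustrative assumptions rather than the paper's full scheme (which also uses feature-level hints).

```python
import torch.nn.functional as F

def self_distillation_loss(exit_logits, labels, T=3.0, alpha=0.5):
    """Self-distillation across auxiliary exits: the deepest classifier
    acts as the in-network teacher for the shallower ones.

    `exit_logits` is a list ordered shallow -> deep.
    """
    teacher = exit_logits[-1].detach()  # deepest exit carries no gradient as teacher
    loss = F.cross_entropy(exit_logits[-1], labels)
    for logits in exit_logits[:-1]:
        ce = F.cross_entropy(logits, labels)
        kd = F.kl_div(
            F.log_softmax(logits / T, dim=1),
            F.softmax(teacher / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        loss = loss + (1 - alpha) * ce + alpha * kd
    return loss
```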

Decoupled Knowledge Distillation

open access: yes2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022
Accepted by CVPR 2022, fix ...
Zhao, Borui   +4 more
openaire   +2 more sources
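
Decoupled Knowledge Distillation reformulates the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently instead of being coupled by the teacher's confidence. The sketch below follows that public formulation at a high level; the masking trick and hyperparameter defaults are assumptions and may differ from the official release.

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=8.0, T=4.0):
    """Sketch of Decoupled KD: split classical KD into a target-class
    term (TCKD) and a non-target-class term (NCKD), weighted independently."""
    gt_mask = F.one_hot(target, student_logits.size(1)).bool()

    s = F.softmax(student_logits / T, dim=1)
    t = F.softmax(teacher_logits / T, dim=1)

    # Binary (target vs. rest) distributions for TCKD.
    s_bin = torch.stack([(s * gt_mask).sum(1), (s * ~gt_mask).sum(1)], dim=1)
    t_bin = torch.stack([(t * gt_mask).sum(1), (t * ~gt_mask).sum(1)], dim=1)
    tckd = F.kl_div(s_bin.log(), t_bin, reduction="batchmean") * (T * T)

    # Distributions over non-target classes only for NCKD
    # (the target logit is suppressed by a large negative offset).
    s_nt = F.log_softmax(student_logits / T - 1000.0 * gt_mask, dim=1)
    t_nt = F.softmax(teacher_logits / T - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(s_nt, t_nt, reduction="batchmean") * (T * T)

    return alpha * tckd + beta * nckd
```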

Decoupled Time-Dimensional Progressive Self-Distillation With Knowledge Calibration for Edge Computing-Enabled AIoT

open access: yes, IEEE Access
Time-dimensional self-distillation seeks to transfer knowledge from earlier historical models to subsequent ones with minimal computational overhead.
Yingchao Wang, Wenqi Niu, Hanpo Hou
doaj   +1 more source
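
One common way to realize "earlier historical models teaching subsequent ones" is to keep an exponential moving average of past weights as the teacher. The sketch below assumes that EMA scheme, which may not match the paper's exact mechanism; the teacher would start as a deep copy of the student.

```python
import torch
import torch.nn.functional as F

def update_historical_teacher(teacher, student, momentum=0.999):
    """Maintain an exponential moving average of past student weights,
    serving as the 'earlier historical model' (an assumed scheme)."""
    with torch.no_grad():
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(momentum).add_(ps, alpha=1 - momentum)

def temporal_kd_loss(student_logits, teacher_logits, T=2.0):
    """Distill the current model toward its own historical predictions."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
```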
