Results 21 to 30 of about 142,324 (226)

A Deep Hierarchical Approach to Lifelong Learning in Minecraft

open access: yes, 2016
We propose a lifelong learning system that has the ability to reuse and transfer knowledge from one task to another while efficiently retaining the previously learned knowledge base.
Givony, Shahar   +4 more
core   +1 more source

Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation

open access: yes, 2019
Convolutional neural networks have been widely deployed in various application scenarios. To extend their applications to accuracy-critical domains, researchers have been investigating approaches to boost accuracy through either ...
Bao, Chenglong   +5 more
core   +1 more source

Decoupled Time-Dimensional Progressive Self-Distillation With Knowledge Calibration for Edge Computing-Enabled AIoT

open access: yes, IEEE Access
The time-dimensional self-distillation seeks to transfer knowledge from earlier historical models to subsequent ones with minimal computational overhead.
Yingchao Wang, Wenqi Niu, Hanpo Hou
doaj   +1 more source

Building and road detection from remote sensing images based on weights adaptive multi-teacher collaborative distillation using a fused knowledge

open access: yes, International Journal of Applied Earth Observations and Geoinformation, 2023
Knowledge distillation is an effective approach to compressing deep learning models. However, current distillation methods remain relatively uniform, and there are still few studies on combining distillation strategies that use multiple types of ...
Ziyi Chen   +5 more
doaj   +1 more source

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons

open access: yes, 2018
An activation boundary for a neuron refers to a separating hyperplane that determines whether the neuron is activated or deactivated. It has long been considered in neural networks that the activations of neurons, rather than their exact output values ...
Choi, Jin Young   +3 more
core   +1 more source

MKD: Mixup-Based Knowledge Distillation for Mandarin End-to-End Speech Recognition

open access: yes, Algorithms, 2022
Large-scale automatic speech recognition models have achieved impressive performance. However, huge computational resources and a massive amount of data are required to train an ASR model.
Xing Wu   +4 more
doaj   +1 more source

Knowledge Distillation with Adversarial Samples Supporting Decision Boundary

open access: yes, 2018
Many recent works on knowledge distillation have provided ways to transfer the knowledge of a trained network for improving the learning process of a new one, but finding a good technique for knowledge distillation is still an open problem. In this paper,
Choi, Jin Young   +3 more
core   +1 more source

NeuRes: Highly Activated Neurons Responses Transfer via Distilling Sparse Activation Maps

open access: yes, IEEE Access, 2022
In recent years, knowledge distillation has attracted significant interest for mobile, edge, and IoT devices due to its ability to transfer knowledge from a large, complex teacher to a lightweight student network.
Sharmen Akhter   +3 more
doaj   +1 more source

Progressive Label Distillation: Learning Input-Efficient Deep Neural Networks [PDF]

open access: yes, 2019
Much of the focus in the area of knowledge distillation has been on distilling knowledge from a larger teacher network to a smaller student network. However, there has been little research on how the concept of distillation can be leveraged to distill ...
Lin, Zhong Qiu, Wong, Alexander
core   +2 more sources

Mycobacterial cell division arrest and smooth‐to‐rough envelope transition using CRISPRi‐mediated genetic repression systems

open access: yes, FEBS Open Bio, EarlyView.
CRISPRi‑mediated gene silencing and phenotypic exploration in nontuberculous mycobacteria. In this Research Protocol, we describe approaches to control, monitor, and quantitatively assess CRISPRi‑mediated gene silencing in M. smegmatis and M. abscessus model organisms.
Vanessa Point   +7 more
wiley   +1 more source
