Results 61 to 70 of about 4,930,132 (374)
Self-Distillation for Randomized Neural Networks
Knowledge distillation (KD) is a conventional method in the field of deep learning that enables the transfer of dark knowledge from a teacher model to a student model, consequently improving the performance of the student model. In randomized neural networks, due to the simple topology of network architecture and the insignificant relationship between ...
Minghui Hu +2 more
openaire +4 more sources
Orchestrate Latent Expertise: Advancing Online Continual Learning with Multi-Level Supervision and Reverse Self-Distillation [PDF]
To accommodate real-world dynamics, artificial intelligence systems need to cope with sequentially arriving content in an online manner. Beyond regular Continual Learning (CL) attempting to address catastrophic forgetting with offline training of each ...
Hongwei Yan +3 more
semanticscholar +1 more source
Spatial Self-Distillation for Object Detection with Inaccurate Bounding Boxes
Object detection with inaccurate bounding box supervision has attracted broad interest due to the expense of high-quality annotation data or the occasional inevitability of low annotation quality (e.g., tiny objects).
Chen, Pengfei +5 more
core
The inhibition of mitochondrial dihydroorotate dehydrogenase (DHODH) impairs syncytialization and induces cellular senescence via mitochondrial and endoplasmic reticulum stress in human trophoblast stem cells, elevating sFlt1/PlGF levels, a hallmark of placental dysfunction in hypertensive disorders of pregnancy.
Kanoko Yoshida +6 more
wiley +1 more source
A Deep Semantic Segmentation Approach to Map Forest Tree Dieback in Sentinel-2 Data
Massive tree dieback events triggered by various disturbance agents, such as insect outbreaks, pests, fires, and windstorms, have recently compromised the health of forests in numerous countries with a significant impact on ecosystems.
Giuseppina Andresini +2 more
doaj +1 more source
Lower bounds for kernelizations [PDF]
"See the abstract at the beginning of the document in the attached file"
Chen, Yijia +3 more
core
This study uncovers the unexplored role of intermolecular interactions in multiphoton absorption in coordination polymers. By analyzing [Zn2tpda(DMA)2(DMF)0.3], it shows how the electronic coupling of the chromophores and confinement in the MOF enhance two- and three-photon absorption.
Simon Nicolas Deger +11 more
wiley +1 more source
Not All Voxels are Equal: Hardness-Aware Semantic Scene Completion with Self-Distillation [PDF]
Semantic scene completion, also known as semantic occupancy prediction, can provide dense geometric and semantic information for autonomous vehicles, which attracts the increasing attention of both academia and industry. Unfortunately, existing methods
Song Wang +6 more
semanticscholar +1 more source
Adaptive Similarity Bootstrapping for Self-Distillation
Most self-supervised methods for representation learning leverage a cross-view consistency objective, i.e., they maximize the representation similarity of a given image's augmented views.
Bozorgtabar, Behzad +4 more
core
Semi-Supervised Learning under Class Distribution Mismatch [PDF]
Semi-supervised learning (SSL) aims to avoid the need for collecting prohibitively expensive labelled training data. Whilst demonstrating impressive performance boosts, existing SSL methods artificially assume that small labelled data and large ...
Chen, Y +4 more
core +1 more source