Results 311 to 320 of about 4,930,132
Some of the following articles may not be open access.

Tolerant Self-Distillation for image classification

Neural Networks
Deep neural networks tend to overfit when training data are insufficient. In this paper, we introduce two metrics derived from the intra-class distributions of correctly and incorrectly predicted samples to provide a new perspective on the overfitting issue.
Mushui Liu   +4 more
openaire   +3 more sources

Restructuring the Teacher and Student in Self-Distillation

IEEE Transactions on Image Processing
Knowledge distillation aims to achieve model compression by transferring knowledge from complex teacher models to lightweight student models. To reduce reliance on pre-trained teacher models, self-distillation methods utilize knowledge from the model itself as additional supervision.
Yujie Zheng   +5 more
openaire   +3 more sources
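
The entry above describes self-distillation as reusing knowledge from the model itself as additional supervision, in place of a separate pre-trained teacher. As a generic illustration only (not the method of any specific paper listed here), a minimal temperature-scaled distillation loss might look like the following sketch; the function names and the temperature value are illustrative assumptions:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def self_distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions.

    In self-distillation, teacher_logits come from the same model
    (e.g. an earlier snapshot, a deeper branch, or an averaged copy)
    rather than from a separate pre-trained teacher network.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # T^2 scaling keeps gradient magnitudes comparable across temperatures
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )

# Identical logits give zero loss; diverging logits give a positive loss.
print(round(self_distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
```

In practice this term is added to the usual cross-entropy loss on ground-truth labels, so the model's own soft predictions act as the extra supervision signal the snippet mentions.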

Single-Domain Generalized Object Detection in Urban Scene via Cyclic-Disentangled Self-Distillation

Computer Vision and Pattern Recognition, 2022
In this paper, we are concerned with enhancing the generalization capability of object detectors, and we consider a realistic yet challenging scenario, namely Single-Domain Generalized Object Detection (Single-DGOD), which aims to learn an object ...
Aming Wu, Cheng Deng
semanticscholar   +1 more source

CODI: Compressing Chain-of-Thought into Continuous Space via Self-Distillation

Conference on Empirical Methods in Natural Language Processing
Chain-of-Thought (CoT) reasoning enhances Large Language Models (LLMs) by encouraging step-by-step reasoning in natural language. However, leveraging a latent continuous space for reasoning may offer benefits in terms of both efficiency and robustness ...
Zhenyi Shen   +5 more
semanticscholar   +1 more source

How to build a consistency model: Learning flow maps via self-distillation

arXiv.org
Flow-based generative models achieve state-of-the-art sample quality, but require the expensive solution of a differential equation at inference time.
N. Boffi   +2 more
semanticscholar   +1 more source

M3-Embedding: Multi-Linguality, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation

Annual Meeting of the Association for Computational Linguistics
In this paper, we introduce a new embedding model called M3-Embedding, which is distinguished for its versatility in Multi-Linguality, Multi-Functionality, and Multi-Granularity.
Jianlv Chen   +5 more
semanticscholar   +1 more source

Perturbed Self-Distillation: Weakly Supervised Large-Scale Point Cloud Semantic Segmentation

IEEE International Conference on Computer Vision, 2021
Large-scale point cloud semantic segmentation has wide applications. Current popular research mainly focuses on fully supervised learning, which demands expensive and tedious manual point-wise annotation.
Yachao Zhang   +5 more
semanticscholar   +1 more source

DISD-Net: A Dynamic Interactive Network With Self-Distillation for Cross-Subject Multi-Modal Emotion Recognition

IEEE Transactions on Multimedia
Multi-modal Emotion Recognition (MER) has demonstrated competitive performance in affective computing, owing to its ability to synthesize information from diverse modalities.
Cheng Cheng   +4 more
semanticscholar   +1 more source

Self-Distillation for Few-Shot Image Captioning

2021 IEEE Winter Conference on Applications of Computer Vision (WACV), 2021
The development of large-scale image-captioning datasets is expensive, while the abundance of unpaired images and text corpus can potentially help reduce the efforts of manual annotation. In this paper, we study the few-shot image captioning problem that only requires a small amount of annotated image-caption pairs.
Xianyu Chen, Ming Jiang, Qi Zhao
openaire   +1 more source

Online Self-Distillation and Self-Modeling for 3D Brain Tumor Segmentation

IEEE Journal of Biomedical and Health Informatics
In the specialized domain of brain tumor segmentation, supervised segmentation approaches are hindered by the limited availability of high-quality labeled data, a condition arising from data privacy concerns, significant costs, and ethical issues.
Yan Pang   +10 more
semanticscholar   +1 more source
