Results 21 to 30 of about 4,930,132
TTD: Text-Tag Self-Distillation Enhancing Image-Text Alignment in CLIP to Alleviate Single Tag Bias [PDF]
We identify a critical bias in contemporary CLIP-based models, which we denote as single tag bias. This bias manifests as a disproportionate focus on a singular tag (word) while neglecting other pertinent tags, stemming from CLIP's text embeddings that ...
Sanghyun Jo +4 more
openalex +3 more sources
Similarity and Consistency by Self-distillation Method [PDF]
Due to high data pre-processing costs and missing local feature detection in self-distillation methods for model compression, a similarity and consistency by self-distillation (SCD) method is proposed to improve model classification accuracy. Firstly ...
WAN Xu, MAO Yingchi, WANG Zibo, LIU Yi, PING Ping
doaj +1 more source
Stepwise self-knowledge distillation for skin lesion image classification [PDF]
Self-knowledge distillation, which involves using the same network structure for both the teacher and student models, has gained considerable attention in the field of medical image classification.
Jian Zheng +4 more
doaj +2 more sources
Emerging Properties in Self-Supervised Vision Transformers [PDF]
In this paper, we question if self-supervised learning provides new properties to Vision Transformer (ViT) [16] that stand out compared to convolutional networks (convnets). Beyond the fact that adapting self-supervised methods to this architecture works
Mathilde Caron +6 more
semanticscholar +1 more source
Knowledge Distillation With Feature Self Attention
With the rapid development of deep learning technology, the size and performance of the network continuously grow, making network compression essential for commercial applications.
Sin-Gu Park, Dong-Joong Kang
doaj +1 more source
A self‐distillation object segmentation method via frequency domain knowledge augmentation
Most self‐distillation methods need complex auxiliary teacher structures and require many training samples in the object segmentation task. To address this challenge, a self‐distillation object segmentation method via frequency domain knowledge ...
Lei Chen +3 more
doaj +1 more source
Salt structures are crucial targets in oil and gas seismic exploitation, so a fast, automatic and accurate method is necessary for accelerating salt structure identification in the exploitation process.
Keran Li +6 more
doaj +1 more source
Cervical Cell Image Classification-Based Knowledge Distillation
Current deep-learning-based cervical cell classification methods suffer from parameter redundancy and poor model generalization performance, which creates challenges for the intelligent classification of cervical cytology smear images.
Wenjian Gao +5 more
doaj +1 more source
Self-Distilled Self-supervised Representation Learning
WACV 23, 11 ...
Jang, Jiho +5 more
openaire +2 more sources
On Self-Distilling Graph Neural Network [PDF]
Recently, the teacher-student knowledge distillation framework has demonstrated its potential in training Graph Neural Networks (GNNs). However, due to the difficulty of training over-parameterized GNN models, one may not easily obtain a satisfactory teacher model for distillation.
Chen, Yuzhao +5 more
openaire +2 more sources
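Several of the entries above build on the standard teacher-student distillation objective. As a minimal illustrative sketch (not taken from any listed paper), the temperature-softened KL loss of Hinton-style knowledge distillation can be written in pure Python as:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
match = distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
mismatch = distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0])
```

In self-distillation (as in the SCD, stepwise, and GNN entries above), the "teacher" logits come from the same network, e.g. an earlier checkpoint, a deeper branch, or an exponential moving average, rather than from a separate pretrained model.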