Results 21 to 30 of about 4,930,132

TTD: Text-Tag Self-Distillation Enhancing Image-Text Alignment in CLIP to Alleviate Single Tag Bias [PDF]

open access: green
We identify a critical bias in contemporary CLIP-based models, which we denote as single tag bias. This bias manifests as a disproportionate focus on a singular tag (word) while neglecting other pertinent tags, stemming from CLIP's text embeddings that ...
Sanghyun Jo   +4 more
openalex   +3 more sources

Similarity and Consistency by Self-distillation Method [PDF]

open access: yes (Jisuanji kexue, 2023)
Due to the high data pre-processing cost and the failure to detect local features in existing self-distillation methods for model compression, a similarity and consistency by self-distillation (SCD) method is proposed to improve model classification accuracy. Firstly ...
WAN Xu, MAO Yingchi, WANG Zibo, LIU Yi, PING Ping
doaj   +1 more source

Stepwise self-knowledge distillation for skin lesion image classification [PDF]

open access: yes (Scientific Reports)
Self-knowledge distillation, which involves using the same network structure for both the teacher and student models, has gained considerable attention in the field of medical image classification.
Jian Zheng   +4 more
doaj   +2 more sources
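The entry above defines self-knowledge distillation as using the same network structure for both the teacher and student models. A minimal PyTorch sketch of that setup follows, assuming a frozen snapshot of the network serves as the teacher; the StudentNet architecture, alpha, and temperature T are illustrative choices, not taken from the paper.

```python
# Minimal sketch of self-knowledge distillation: the "teacher" is a frozen
# snapshot of the very network the student continues to train. All names
# (StudentNet, alpha, T) are illustrative assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentNet(nn.Module):
    """Stand-in classifier; any architecture works since teacher == student."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):
        return self.head(self.backbone(x))

def self_distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=4.0):
    # Hard-label cross-entropy plus KL divergence to the softened outputs of
    # the frozen self-copy; T*T rescales gradients as in standard distillation.
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return (1 - alpha) * ce + alpha * kl

student = StudentNet()
teacher = copy.deepcopy(student).eval()   # identical structure, frozen weights
for p in teacher.parameters():
    p.requires_grad_(False)

x, y = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))
with torch.no_grad():
    t_logits = teacher(x)
loss = self_distillation_loss(student(x), t_logits, y)
loss.backward()
```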

Emerging Properties in Self-Supervised Vision Transformers [PDF]

open access: yes (IEEE International Conference on Computer Vision, 2021)
In this paper, we question if self-supervised learning provides new properties to Vision Transformer (ViT) [16] that stand out compared to convolutional networks (convnets). Beyond the fact that adapting self-supervised methods to this architecture works ...
Mathilde Caron   +6 more
semanticscholar   +1 more source
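The paper above (DINO) casts self-supervised learning as self-distillation with no labels: teacher and student share one architecture, the teacher is an exponential moving average (EMA) of the student, and the student matches the teacher's centered, sharpened outputs. A hedged sketch of that update follows; the toy encoder, temperatures, momentum, and centering rate are illustrative assumptions, not the paper's exact settings.

```python
# DINO-style self-distillation sketch: two augmented views, an EMA teacher
# with the same architecture as the student, and a cross-entropy loss
# between the teacher's centered/sharpened distribution and the student's.
import copy
import torch
import torch.nn.functional as F

encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 256))
student = encoder
teacher = copy.deepcopy(encoder)
for p in teacher.parameters():
    p.requires_grad_(False)

center = torch.zeros(256)  # running center of teacher outputs (collapse guard)

def dino_loss(s_out, t_out, t_temp=0.04, s_temp=0.1):
    # Sharpen and center the teacher distribution, then cross-entropy.
    t = F.softmax((t_out - center) / t_temp, dim=1)
    return -(t * F.log_softmax(s_out / s_temp, dim=1)).sum(dim=1).mean()

view1, view2 = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)  # two augmentations
with torch.no_grad():
    t_out = teacher(view1)
loss = dino_loss(student(view2), t_out)
loss.backward()

# EMA teacher update and center update, applied after each optimizer step.
m = 0.996
with torch.no_grad():
    for ps, pt in zip(student.parameters(), teacher.parameters()):
        pt.mul_(m).add_((1 - m) * ps)
    center.mul_(0.9).add_(0.1 * t_out.mean(dim=0))
```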

Knowledge Distillation With Feature Self Attention

open access: yes (IEEE Access, 2023)
With the rapid development of deep learning technology, network size and performance continue to grow, making network compression essential for commercial applications.
Sin-Gu Park, Dong-Joong Kang
doaj   +1 more source

A self‐distillation object segmentation method via frequency domain knowledge augmentation

open access: yes (IET Computer Vision, 2023)
Most self-distillation methods need complex auxiliary teacher structures and require large numbers of training samples for the object segmentation task. To address this challenge, a self-distillation object segmentation method via frequency domain knowledge ...
Lei Chen   +3 more
doaj   +1 more source

Salt structure identification based on U-net model with target flip, multiple distillation and self-distillation methods

open access: yes (Frontiers in Earth Science, 2023)
Salt structures are crucial targets in oil and gas seismic exploration, so a fast, automatic, and accurate method is needed to accelerate salt structure identification during the exploitation process.
Keran Li   +6 more
doaj   +1 more source

Cervical Cell Image Classification-Based Knowledge Distillation

open access: yes (Biomimetics, 2022)
Current deep-learning-based cervical cell classification methods suffer from parameter redundancy and poor model generalization performance, which create challenges for the intelligent classification of cervical cytology smear images.
Wenjian Gao   +5 more
doaj   +1 more source

Self-Distilled Self-supervised Representation Learning

open access: yes (IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023)
WACV 23, 11 ...
Jang, Jiho   +5 more
openaire   +2 more sources

On Self-Distilling Graph Neural Network [PDF]

open access: yes (Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, 2021)
Recently, the teacher-student knowledge distillation framework has demonstrated its potential in training Graph Neural Networks (GNNs). However, due to the difficulty of training over-parameterized GNN models, one may not easily obtain a satisfactory teacher model for distillation.
Chen, Yuzhao   +5 more
openaire   +2 more sources
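The entry above motivates distillation without a separately trained teacher, since a satisfactory teacher GNN is hard to obtain. A generic sketch of teacher-free, layer-wise self-distillation in that spirit follows; plain linear layers stand in for GNN message-passing layers, and the MSE alignment objective is an assumption, not the paper's exact GNN-SD loss.

```python
# Teacher-free self-distillation sketch: a deeper layer's representation
# supervises the shallower ones, so no external teacher model is trained.
import torch
import torch.nn.functional as F

layers = torch.nn.ModuleList([torch.nn.Linear(64, 64) for _ in range(3)])
head = torch.nn.Linear(64, 7)

x = torch.randn(100, 64)              # stand-in node features for 100 nodes
labels = torch.randint(0, 7, (100,))  # stand-in node labels, 7 classes

hidden = []
h = x
for layer in layers:
    h = F.relu(layer(h))
    hidden.append(h)

task_loss = F.cross_entropy(head(h), labels)
# Shallow layers mimic the (detached) final representation: the network
# distills knowledge into itself rather than learning from a separate teacher.
distill_loss = sum(F.mse_loss(h_i, hidden[-1].detach()) for h_i in hidden[:-1])
loss = task_loss + 0.1 * distill_loss
loss.backward()
```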
