Results 21 to 30 of about 14,848 (289)

Mind the Trade-off: Debiasing NLU Models without Degrading the In-distribution Performance [PDF]

open access: yes, 2020
Models for natural language understanding (NLU) tasks often rely on the idiosyncratic biases of the dataset, which make them brittle against test cases outside the training distribution.
Gurevych, Iryna   +2 more
core   +2 more sources

A Lightweight Graph Neural Network Algorithm for Action Recognition Based on Self-Distillation

open access: yes, Algorithms, 2023
Recognizing human actions can help in numerous ways, such as health monitoring, intelligent surveillance, virtual reality and human–computer interaction. A quick and accurate detection algorithm is required for daily real-time detection. This paper first ...
Miao Feng, Jean Meunier
doaj   +1 more source

Revisiting Self-Distillation

open access: yes, 2022
Knowledge distillation is the procedure of transferring "knowledge" from a large model (the teacher) to a more compact one (the student), often being used in the context of model compression. When both models have the same architecture, this procedure is called self-distillation.
Pham, Minh   +3 more
openaire   +2 more sources
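The "Revisiting Self-Distillation" snippet defines distillation as transferring knowledge from a teacher to a student via the teacher's soft predictions. A minimal sketch of that objective (not this paper's method) is the standard temperature-softened KL loss; the function names and the numpy-only setup here are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in the classic knowledge-distillation formulation.
    In self-distillation, teacher and student share the same architecture."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits match the teacher's, the loss is zero; any disagreement yields a positive penalty, which is what drives the transfer.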

Bayesian Optimization Meets Self-Distillation

open access: yes, 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023
Bayesian optimization (BO) has contributed greatly to improving model performance by suggesting promising hyperparameter configurations iteratively based on observations from multiple training trials. However, only partial knowledge (i.e., the measured performances of trained models and their hyperparameter configurations) from previous trials is ...
Lee, HyunJae   +5 more
openaire   +2 more sources

Point Cloud Instance Segmentation with Inaccurate Bounding-Box Annotations

open access: yes, Sensors, 2023
Most existing point cloud instance segmentation methods require accurate and dense point-level annotations, which are extremely laborious to collect.
Yinyin Peng, Hui Feng, Tao Chen, Bo Hu
doaj   +1 more source

Self-distillation for Surgical Action Recognition

open access: yes, 2023
Surgical scene understanding is a key prerequisite for context-aware decision support in the operating room. While deep learning-based approaches have already reached or even surpassed human performance in various fields, the task of surgical action recognition remains a major challenge.
Yamlahi, Amine   +11 more
openaire   +2 more sources

Self-Distillation for Unsupervised 3D Domain Adaptation [PDF]

open access: yes, 2023
Point cloud classification is a popular task in 3D vision. However, previous works usually assume that point clouds at test time are obtained with the same procedure or sensor as those at training time.
Adriano Cardace   +4 more
core   +1 more source

Salt structure identification based on U-net model with target flip, multiple distillation and self-distillation methods

open access: yes, Frontiers in Earth Science, 2023
Salt structures are crucial targets in oil and gas seismic exploration, so a fast, automatic and accurate method is needed to accelerate salt structure identification in the exploration process.
Keran Li   +6 more
doaj   +1 more source

Cervical Cell Image Classification-Based Knowledge Distillation

open access: yes, Biomimetics, 2022
Current deep-learning-based cervical cell classification methods suffer from parameter redundancy and poor model generalization performance, which creates challenges for the intelligent classification of cervical cytology smear images.
Wenjian Gao   +5 more
doaj   +1 more source

GSC-MIM: Global semantic integrated self-distilled complementary masked image model for remote sensing images scene classification

open access: yes, Frontiers in Ecology and Evolution, 2022
Masked image modeling (MIM) is a learning method in which the unmasked components of the input are utilized to learn and predict the masked signal, enabling learning from large amounts of unannotated data.
Xuying Wang   +4 more
doaj   +1 more source
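The GSC-MIM snippet describes the core masked-image-modeling loop: hide part of the input, predict the hidden part from the visible part, and score only the masked positions. A toy numpy illustration of that objective follows; the mean-of-visible "predictor", the function name, and the patch layout are stand-in assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for a reproducible mask

def mim_step(patches, mask_ratio=0.6):
    """Toy masked-image-modeling objective over (n_patches, dim) features:
    hide a fraction of patches, reconstruct them from the visible ones
    (here with a trivial mean predictor), and compute the reconstruction
    loss on the masked positions only."""
    n = patches.shape[0]
    mask = rng.random(n) < mask_ratio           # True = hidden from the model
    visible_mean = patches[~mask].mean(axis=0)  # stand-in encoder/decoder
    pred = np.tile(visible_mean, (int(mask.sum()), 1))
    return float(np.mean((pred - patches[mask]) ** 2))  # masked-only loss
```

A real MIM model replaces the mean predictor with an encoder–decoder network, but the masking and masked-only loss structure is the same.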
