Results 11 to 20 of about 514,020

Defensive Few-shot Learning

open access: yes · IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022
This paper investigates a new and challenging problem called defensive few-shot learning, which aims to learn a few-shot model that is robust against adversarial attacks. Simply applying existing adversarial defense methods to few-shot learning cannot effectively solve this problem.
Wenbin Li   +6 more
openaire   +4 more sources

Few-Shot Learning With Geometric Constraints [PDF]

open access: yes · IEEE Transactions on Neural Networks and Learning Systems, 2020
In this article, we consider the problem of few-shot learning for classification. We assume a network trained for base categories with a large number of training examples, and we aim to add novel categories to it that have only a few, e.g., one or five, training examples.
Hong-Gyu Jung, Seong-Whan Lee
openaire   +3 more sources

Few-Shot Partial Multi-View Learning

open access: yes · IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023
In real-world applications, data often come with multiple views, and fully exploiting the information in each view is crucial for making data more representative. However, due to various limitations and failures in data collection and pre-processing, real data inevitably suffer from missing views and data scarcity.
Yuan Zhou   +4 more
openaire   +3 more sources

Few-Shot Learning With Class Imbalance

open access: yes · IEEE Transactions on Artificial Intelligence, 2023
Few-Shot Learning (FSL) algorithms are commonly trained through Meta-Learning (ML), which exposes models to batches of tasks sampled from a meta-dataset to mimic tasks seen during evaluation. However, the standard training procedures overlook the real-world dynamics where classes commonly occur at different frequencies. While it is generally understood …
Mateusz Ochal   +4 more
openaire   +4 more sources

Incrementally Learned Angular Representations for Few-Shot Class-Incremental Learning

open access: yes · IEEE Access, 2023
The main challenge of few-shot class-incremental learning (FSCIL) is the trade-off between underfitting to a new session's task and preventing forgetting of the knowledge from earlier sessions. In this paper, we reveal that the angular space occupied by the features within the embedded area is ...
In-Ug Yoon, Jong-Hwan Kim
doaj   +1 more source

Few‐shot learning with relation propagation and constraint

open access: yes · IET Computer Vision, 2021
Previous deep learning methods usually required large-scale annotated data, which is computationally expensive and unrealistic in certain scenarios. Therefore, few-shot learning, where only a few annotated training images are available for training, has …
Huiyun Gong   +6 more
doaj   +1 more source

Few-Shot Lifelong Learning

open access: yes · Proceedings of the AAAI Conference on Artificial Intelligence, 2021
Many real-world classification problems involve classes with very few labeled training samples. Moreover, not all possible classes may be available at the start of training; some may be introduced incrementally. Deep learning models need to handle this two-fold problem in order to perform well in real-life situations.
Pratik Mazumder   +2 more
openaire   +2 more sources

Federated Few-shot Learning

open access: yes · Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2023
Federated Learning (FL) enables multiple clients to collaboratively learn a machine learning model without exchanging their own local data. In this way, the server can exploit the computational power of all clients and train the model on a larger set of data samples among all clients.
Song Wang   +5 more
openaire   +2 more sources

Few-Shot Learning for Opinion Summarization [PDF]

open access: yes · Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
Opinion summarization is the automatic creation of text reflecting subjective information expressed in multiple documents, such as user reviews of a product. The task is practically important and has attracted a lot of attention. However, due to the high cost of summary production, datasets large enough for training supervised models are lacking …
A. Bražinskas, M. Lapata, I. Titov
openaire   +3 more sources

Augmenting Few-Shot Learning With Supervised Contrastive Learning

open access: yes · IEEE Access, 2021
Few-shot learning deals with small amounts of data, which leads to poor performance under a conventional cross-entropy loss. We propose a pretraining approach for few-shot learning scenarios.
Taemin Lee, Sungjoo Yoo
doaj   +1 more source
