Results 321 to 330 of about 842,651
Some of the following articles may not be open access.
Class Incremental Learning With Task-Selection
2020 IEEE International Conference on Image Processing (ICIP), 2020
Despite the success of deep neural networks (DNNs), in incremental learning DNNs are known to suffer from catastrophic forgetting: the phenomenon of entirely forgetting previously learned task information upon learning the current task.
Eun Sung Kim +4 more
openaire +1 more source
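The catastrophic forgetting defined in this entry is easy to reproduce. Below is a minimal, hypothetical PyTorch sketch (toy synthetic tasks; all names are illustrative, not from the paper) showing accuracy on an old task collapsing after naive sequential training on a new one:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(axis):
    # Toy binary task: the label depends on the sign along one axis.
    x = torch.randn(200, 2)
    y = (x[:, axis] > 0).long()
    return x, y

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def fit(x, y, steps=300):
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def acc(x, y):
    return (model(x).argmax(1) == y).float().mean().item()

xa, ya = make_task(axis=0)   # "old" task
xb, yb = make_task(axis=1)   # "new" task with a different decision boundary
fit(xa, ya)
print("old-task accuracy after task A:", acc(xa, ya))   # high
fit(xb, yb)                  # naive sequential training, no mitigation
print("old-task accuracy after task B:", acc(xa, ya))   # typically drops sharply
```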
Task-Agnostic Guided Feature Expansion for Class-Incremental Learning
Computer Vision and Pattern Recognition
The ability to learn new concepts while preserving learned knowledge is desirable for learning systems in Class-Incremental Learning (CIL). Recently, feature expansion of the model has become a prevalent solution for CIL, where the old features are fixed ...
Bowen Zheng +3 more
semanticscholar +1 more source
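A minimal sketch of the feature-expansion idea summarized in this entry, assuming a PyTorch setup: freeze the old feature extractor, train a new branch, and classify on the concatenated features. Class names and dimensions are illustrative, not the paper's implementation.

```python
import torch
import torch.nn as nn

class ExpandableNet(nn.Module):
    def __init__(self, old_extractor: nn.Module, in_dim: int,
                 old_dim: int, new_dim: int, num_classes: int):
        super().__init__()
        self.old = old_extractor
        for p in self.old.parameters():
            p.requires_grad_(False)                 # old features stay fixed
        self.new = nn.Sequential(nn.Linear(in_dim, new_dim), nn.ReLU())
        self.head = nn.Linear(old_dim + new_dim, num_classes)

    def forward(self, x):
        with torch.no_grad():                       # no gradient into old branch
            f_old = self.old(x)
        f_new = self.new(x)
        return self.head(torch.cat([f_old, f_new], dim=1))

# Usage: expand a model trained on 10 old classes to 10 + 5 classes.
old_extractor = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
model = ExpandableNet(old_extractor, in_dim=32, old_dim=64, new_dim=64, num_classes=15)
logits = model(torch.randn(8, 32))                  # -> shape (8, 15)
```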
CL-LoRA: Continual Low-Rank Adaptation for Rehearsal-Free Class-Incremental Learning
Computer Vision and Pattern Recognition
Class-Incremental Learning (CIL) aims to learn new classes sequentially while retaining the knowledge of previously learned classes. Recently, pre-trained models (PTMs) combined with parameter-efficient fine-tuning (PEFT) have shown remarkable ...
Jiangpeng He, Zhihao Duan, F. Zhu
semanticscholar +1 more source
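For context on the PEFT mechanism this entry builds on, here is a hedged sketch of low-rank adaptation (LoRA) over a frozen linear layer; ranks, scaling, and names are illustrative assumptions, not CL-LoRA's exact design.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                 # pre-trained weights frozen
        # Low-rank update W + (alpha/rank) * B @ A; only A and B are trained.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

# Training one small adapter per incremental task leaves the backbone
# untouched, which is what makes such approaches rehearsal-free.
layer = LoRALinear(nn.Linear(128, 128), rank=4)
out = layer(torch.randn(2, 128))                    # -> shape (2, 128)
```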
Distilled Meta-learning for Multi-Class Incremental Learning
ACM Transactions on Multimedia Computing, Communications, and Applications, 2023
Meta-learning approaches have recently achieved promising performance in multi-class incremental learning. However, meta-learners still suffer from catastrophic forgetting, i.e., they tend to forget the learned knowledge from old tasks when they focus on rapidly adapting to the new classes of the current task.
Hao Liu +5 more
openaire +1 more source
Few-Shot Class-Incremental Learning via Class-Aware Bilateral Distillation
Computer Vision and Pattern Recognition, 2023
Few-Shot Class-Incremental Learning (FSCIL) aims to continually learn novel classes from only a few training samples, which poses a more challenging task than the well-studied Class-Incremental Learning (CIL) due to data scarcity.
Linglan Zhao +6 more
semanticscholar +1 more source
Topology-Preserving Class-Incremental Learning
2020
A well-known issue in class-incremental learning is the catastrophic forgetting phenomenon, where the network's recognition performance on old classes degrades severely when incrementally learning new classes. To alleviate forgetting, we propose to preserve the old-class knowledge by maintaining the topology of the network's feature space. On this ...
Xiaoyu Tao +4 more
openaire +1 more source
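One simple way to penalize changes in feature-space topology, in the spirit of this entry, is to keep pairwise feature distances close to their values under the frozen previous model. The sketch below is an illustration under that assumption, not the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def topology_loss(feats_new: torch.Tensor, feats_old: torch.Tensor) -> torch.Tensor:
    """feats_new: features of a batch from the current model;
    feats_old: features of the same inputs from the frozen previous model.
    Both have shape (batch, dim)."""
    d_new = torch.cdist(feats_new, feats_new)   # pairwise distances now
    d_old = torch.cdist(feats_old, feats_old)   # pairwise distances before
    return F.mse_loss(d_new, d_old)

# Usage inside an incremental training step (hypothetical tensors):
loss = topology_loss(torch.randn(16, 64, requires_grad=True), torch.randn(16, 64))
loss.backward()
```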
MalFSCIL: A Few-Shot Class-Incremental Learning Approach for Malware Detection
IEEE Transactions on Information Forensics and Security
The continuous evolution of malware is posing a serious threat to personal privacy, enterprise data security, and global network infrastructure. For example, attackers can use phishing emails, botnets, etc.
Yuhan Chai +7 more
semanticscholar +1 more source
Mixture of Noise for Pre-Trained Model-Based Class-Incremental Learning
arXiv.org
Class Incremental Learning (CIL) aims to continuously learn new categories while retaining the knowledge of old ones. Pre-trained models (PTMs) show promising capabilities in CIL.
Kai Jiang +4 more
semanticscholar +1 more source
Prompt-Based Concept Learning for Few-Shot Class-Incremental Learning
IEEE Transactions on Circuits and Systems for Video Technology
Few-Shot Class-Incremental Learning (FSCIL) faces a severe stability-plasticity challenge: it must continually learn knowledge of new classes from a small number of training samples without forgetting the knowledge of previously seen old classes.
Shuo Li +6 more
semanticscholar +1 more source
External Knowledge Injection for CLIP-Based Class-Incremental Learning
IEEE International Conference on Computer Vision
Class-Incremental Learning (CIL) enables learning systems to continuously adapt to evolving data streams. With the advancement of pre-training, leveraging pre-trained vision-language models (e.g., CLIP) offers a promising starting point for CIL. However, ...
Da-Wei Zhou +5 more
semanticscholar +1 more source