Results 51 to 60 of about 842,651

Deep Error-Correcting Output Codes

open access: yes · Algorithms, 2023
Ensemble learning, online learning and deep learning are very effective and versatile in a wide spectrum of problem domains, such as feature extraction, multi-class classification and retrieval.
Li-Na Wang   +4 more
doaj   +1 more source
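The snippet stops before the method itself; as background, here is a minimal sketch of the classical error-correcting output codes that the deep variant builds on (the codebook values below are illustrative, not taken from the paper):

```python
import numpy as np

# Classical ECOC: each of K classes is assigned a binary codeword of length L;
# L independent binary classifiers are trained, one per bit. Decoding picks
# the class whose codeword is closest in Hamming distance to the L outputs.
codebook = np.array([        # K=4 classes, L=6 bits (illustrative values)
    [0, 0, 1, 1, 0, 1],
    [1, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 1],
    [1, 1, 1, 0, 0, 0],
])

def ecoc_decode(bit_predictions: np.ndarray) -> int:
    """Return the class whose codeword has the smallest Hamming distance
    to the L predicted bits."""
    distances = np.sum(codebook != bit_predictions, axis=1)
    return int(np.argmin(distances))

# Example: suppose the six binary classifiers output these bits.
print(ecoc_decode(np.array([1, 0, 0, 1, 1, 1])))  # closest to class 1
```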

SAR Target Incremental Recognition Based on Hybrid Loss Function and Class-Bias Correction

open access: yes · Applied Sciences, 2022
The Synthetic Aperture Radar (SAR) target recognition model usually has to be retrained on all samples whenever new samples of previously unseen targets arrive.
Yongsheng Zhou   +4 more
doaj   +1 more source
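The snippet names a hybrid loss function and class-bias correction without detailing either. Below is a hedged sketch of one widely used bias-correction scheme in class-incremental learning, a BiC-style affine rescaling of new-class logits fit on a small balanced set; the paper's actual correction may differ.

```python
import torch
import torch.nn as nn

class BiasCorrection(nn.Module):
    """BiC-style correction: leave old-class logits untouched and apply a
    learned affine transform (alpha, beta) to new-class logits, typically
    fit on a small class-balanced validation set. Shown as a common
    baseline; not necessarily this paper's exact method."""
    def __init__(self, num_old: int):
        super().__init__()
        self.num_old = num_old
        self.alpha = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.zeros(1))

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        old, new = logits[:, :self.num_old], logits[:, self.num_old:]
        return torch.cat([old, self.alpha * new + self.beta], dim=1)

# Usage: wrap the backbone's logits before the softmax / cross-entropy.
correct = BiasCorrection(num_old=10)
corrected_logits = correct(torch.randn(4, 15))  # 10 old + 5 new classes
```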

Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition

open access: yes, 2020
We introduce a probabilistic approach to unify open set recognition with the prevention of catastrophic forgetting in deep continual learning, based on variational Bayesian inference.
Hong, Yong Won   +4 more
core   +1 more source
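A hedged sketch of the generative-replay idea the abstract describes, mixing current-task data with samples drawn from a generator trained on past tasks (the variational and open-set components are omitted; all names here are illustrative assumptions, not the paper's interface):

```python
import torch

def generative_replay_batch(generator, old_solver, real_x, real_y,
                            replay_ratio: float = 0.5):
    """Hypothetical helper illustrating generative replay: mix real data
    from the current task with pseudo-samples drawn from a generator
    trained on past tasks, labelled by the previous (frozen) solver.
    Names and the 50/50 mix are assumptions, not the paper's protocol."""
    n_replay = int(replay_ratio * real_x.size(0))
    with torch.no_grad():
        z = torch.randn(n_replay, generator.latent_dim)
        fake_x = generator(z)                      # pseudo-samples of old classes
        fake_y = old_solver(fake_x).argmax(dim=1)  # pseudo-labels from old model
    x = torch.cat([real_x, fake_x], dim=0)
    y = torch.cat([real_y, fake_y], dim=0)
    return x, y
```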

PyCIL: a Python toolbox for class-incremental learning

open access: yes · Science China Information Sciences, 2023
Traditional machine learning systems are deployed under a closed-world setting, which assumes the entire training data is available before offline training. However, real-world applications often face incoming new classes, and a model should incorporate them continually. This learning paradigm is called Class-Incremental Learning (CIL). We propose …
Da-Wei Zhou   +3 more
openaire   +2 more sources
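Rather than guess at PyCIL's own interface, here is a minimal sketch of the CIL protocol the abstract describes: class labels arrive in disjoint groups, and the model is updated once per phase. All names below are illustrative, not part of PyCIL's API.

```python
from typing import Iterable

def class_incremental_phases(labels: Iterable[int], classes_per_phase: int):
    """Split a label set into disjoint groups, one group per incremental
    phase, as in the CIL protocol described above (illustrative helper)."""
    ordered = sorted(set(labels))
    for i in range(0, len(ordered), classes_per_phase):
        yield ordered[i:i + classes_per_phase]

# Example: a CIFAR-100-style setup with 10 phases of 10 classes each.
for phase, classes in enumerate(class_incremental_phases(range(100), 10)):
    pass  # train on `classes` only, then evaluate on all classes seen so far
```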

Exemplar-Supported Representation for Effective Class-Incremental Learning

open access: yes · IEEE Access, 2020
Catastrophic forgetting is a key challenge for class-incremental learning with deep neural networks, where performance degrades considerably when dealing with long sequences of new classes.
Lei Guo   +3 more
doaj   +1 more source
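Exemplar memories in CIL are commonly filled by herding, as popularized by iCaRL: greedily pick the samples whose running mean best approximates the class mean in feature space. The sketch below shows that common baseline; the paper's own selection rule may differ.

```python
import numpy as np

def herding_selection(features: np.ndarray, m: int) -> list[int]:
    """Greedy herding: pick m sample indices whose running feature mean
    tracks the true class mean (iCaRL-style baseline, not necessarily
    this paper's rule). `features` has shape (n_samples, d)."""
    class_mean = features.mean(axis=0)
    selected: list[int] = []
    running_sum = np.zeros_like(class_mean)
    for k in range(1, m + 1):
        # Choose the sample that brings the mean of the selected
        # exemplars closest to the class mean.
        candidates = (running_sum + features) / k        # (n, d) broadcast
        dists = np.linalg.norm(candidates - class_mean, axis=1)
        dists[selected] = np.inf                          # no repeats
        idx = int(np.argmin(dists))
        selected.append(idx)
        running_sum += features[idx]
    return selected
```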

Generative Feature Replay For Class-Incremental Learning [PDF]

open access: yes · 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2020
Humans are capable of learning new tasks without forgetting previous ones, while neural networks fail due to catastrophic forgetting between new and previously learned tasks. We consider a class-incremental setting, which means that the task-ID is unknown at inference time.
Xialei Liu   +7 more
openaire   +3 more sources
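As the title indicates, the method replays features rather than images: a generator models intermediate-layer features of past classes, and the replayed features feed only the classifier head. A hedged sketch of one such training step, where all names (`backbone`, `head`, `feat_generator.sample`) are assumed for illustration:

```python
import torch
import torch.nn.functional as F

def feature_replay_step(backbone, head, feat_generator, x_new, y_new):
    """One illustrative update mixing real features of new-class images
    with generated features of old classes; only the head sees the
    replayed features. Names, the sample() interface (returning features
    and integer labels), and the 1:1 mix are assumptions."""
    real_feats = backbone(x_new)                         # features of new data
    with torch.no_grad():
        z = torch.randn(x_new.size(0), feat_generator.latent_dim)
        fake_feats, fake_y = feat_generator.sample(z)    # old-class features
    logits = head(torch.cat([real_feats, fake_feats], dim=0))
    targets = torch.cat([y_new, fake_y], dim=0)
    return F.cross_entropy(logits, targets)
```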

Incremental multiple objective genetic algorithms [PDF]

open access: yes, 2004
This paper presents a new genetic algorithm approach to multi-objective optimization problems: Incremental Multiple Objective Genetic Algorithms (IMOGA).
Chen, Q, Guan, SU
core   +1 more source
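The core primitive beneath any multi-objective GA is the Pareto-dominance test used to rank candidates. A generic sketch follows; the paper's specific incremental scheme is not reproduced here.

```python
def dominates(a: list[float], b: list[float]) -> bool:
    """True if solution `a` Pareto-dominates `b` under minimization:
    no worse on every objective and strictly better on at least one.
    Generic primitive underlying most multi-objective GA rankings."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

assert dominates([1.0, 2.0], [1.5, 2.0])      # better on one, equal on the other
assert not dominates([1.0, 3.0], [1.5, 2.0])  # trade-off: neither dominates
```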

Class-incremental Learning using a Sequence of Partial Implicitly Regularized Classifiers

open access: yes · Proceedings of the International Florida Artificial Intelligence Research Society Conference, 2022
In class-incremental learning, the objective is to learn a number of classes sequentially without having access to the whole training data. However, due to a problem known as catastrophic forgetting, neural networks suffer a substantial performance drop in …
Sobirdzhon Bobiev   +3 more
doaj   +1 more source
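One generic way to realize a sequence of partial classifiers is to train a separate head per increment and, since the task-ID is unknown at test time, predict the globally most confident class across heads. The sketch below shows only that generic setting, not the paper's implicit regularization:

```python
import torch

def predict_from_heads(features: torch.Tensor, heads) -> torch.Tensor:
    """Illustrative inference over a sequence of partial classifiers:
    each head covers one increment's classes; pick the globally
    most-confident class. This naive rule assumes comparably calibrated
    heads and is a generic sketch, not the paper's exact procedure."""
    all_logits = [head(features) for head in heads]  # one tensor per head
    logits = torch.cat(all_logits, dim=1)            # (batch, total_classes)
    return logits.argmax(dim=1)                      # global class index
```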

End-to-end Incremental Learning [PDF]

open access: yes, 2018
Although deep learning approaches have stood out in recent years due to their state-of-the-art results, they continue to suffer from catastrophic forgetting, a dramatic decrease in overall performance when training incrementally with new classes.
Alahari, Karteek   +4 more
core   +4 more sources
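End-to-end approaches of this kind typically optimize a cross-entropy term on the new data jointly with a distillation term that keeps old-class outputs close to the frozen previous model. A hedged sketch of such a combined loss (the temperature and weighting are common defaults, not necessarily the paper's):

```python
import torch
import torch.nn.functional as F

def ce_plus_distillation(new_logits, old_logits, targets,
                         num_old: int, T: float = 2.0, lam: float = 1.0):
    """Cross-entropy on current labels plus distillation of old-class
    logits against the frozen previous model, a common CIL recipe.
    `old_logits` comes from the previous model on the same batch;
    temperature T and weight lam are assumed defaults."""
    ce = F.cross_entropy(new_logits, targets)
    p_old = F.log_softmax(new_logits[:, :num_old] / T, dim=1)
    q_old = F.softmax(old_logits[:, :num_old] / T, dim=1)
    kd = F.kl_div(p_old, q_old, reduction="batchmean") * (T * T)
    return ce + lam * kd
```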

Active Class Incremental Learning for Imbalanced Datasets [PDF]

open access: yes, 2020
Accepted in IPCV workshop from ...
Eden Belouadah   +3 more
openaire   +2 more sources
