Catastrophic forgetting: still a problem for DNNs
We investigate the performance of DNNs when trained on class-incremental visual problems consisting of initial training, followed by retraining with added visual classes.
Abdullah, S. et al.
Combating catastrophic forgetting with developmental compression
Generally intelligent agents exhibit successful behavior across problems in several settings. Endemic in approaches to realize such intelligence in machines is catastrophic forgetting: sequential learning corrupts knowledge obtained earlier in the ...
Bongard, J. et al.
Measuring Catastrophic Forgetting in Neural Networks
Deep neural networks are used in many state-of-the-art systems for machine perception. Once a network is trained to do a specific task, e.g., bird classification, it cannot easily be trained to do new tasks, e.g., incrementally learning to recognize ...
Abitino, Angelina et al.
Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization
Nicolas Y. Masse, David J. Freedman
Domain-incremental white blood cell classification with privacy-aware continual learning
White blood cell (WBC) classification plays a vital role in hematology for diagnosing various medical conditions. However, it faces significant challenges due to domain shifts caused by variations in sample sources (e.g., blood or bone marrow) and ...
Pratibha Kumari et al.
Overcoming catastrophic forgetting in neural networks
Deep neural networks are currently the most successful machine-learning technique for solving a variety of tasks, including language translation, image classification, and image generation. One weakness of such models is that, unlike humans, they are unable to learn multiple tasks sequentially.
Kirkpatrick, J. et al.
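The method proposed in this paper, elastic weight consolidation (EWC), penalizes drift of parameters that were important for earlier tasks. A minimal sketch of the EWC penalty, using a diagonal Fisher information estimate and a toy two-parameter example (the specific numbers are illustrative, not from the paper):

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC penalty (lambda/2) * sum_i F_i * (theta_i - theta*_i)^2,
    where theta* are the parameters learned on the previous task and
    F_i is the diagonal Fisher information estimated on that task."""
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Toy example: drift on a high-Fisher (important) dimension is
# penalized far more than the same drift on a low-Fisher one.
theta_star = np.array([1.0, -2.0])   # parameters after task A
fisher     = np.array([10.0, 0.1])   # per-parameter importance
theta      = np.array([1.5, -1.0])   # current parameters during task B
task_b_loss = 0.0                    # placeholder for the new-task loss
total = task_b_loss + ewc_penalty(theta, theta_star, fisher, lam=1.0)
```

In training, `total` would replace the plain new-task loss, so gradient descent trades off new-task accuracy against staying close to the old solution along important directions.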
Do not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting
One of the major limitations of deep learning models is that they face catastrophic forgetting in an incremental learning scenario. There have been several approaches proposed to tackle the problem of incremental learning. Most of these methods are based on knowledge distillation and do not adequately utilize the information provided by older task ...
Vinod K. Kurmi et al.
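The knowledge-distillation approaches this abstract refers to keep a frozen copy of the old model and penalize the new model for deviating from its softened outputs (as in Learning without Forgetting). A minimal sketch of such a distillation term, with illustrative logits rather than a real network:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()                  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(old_logits, new_logits, T=2.0):
    """Cross-entropy between the frozen old model's softened outputs
    (the targets) and the new model's softened outputs. Minimizing it
    keeps the new model's predictions close to the old model's."""
    p_old = softmax(old_logits, T)
    p_new = softmax(new_logits, T)
    return -np.sum(p_old * np.log(p_new + 1e-12))

# Matching the old model exactly gives the minimum (the entropy of
# p_old); diverging from it raises the loss.
same = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
diff = distillation_loss([2.0, 0.5, -1.0], [0.0, 0.0, 0.0])
```

This term is typically added to the new-task classification loss, weighted by a hyperparameter.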
Variational Continuous Bayesian Meta-learning Based Algorithm for Recommendation
Meta-learning methods have been introduced into recommendation algorithms in recent years to alleviate the problem of cold start. The existing meta-learning algorithms can only improve the ability of the algorithm to deal with a set of statically ...
ZHU Wentao, LIU Wei, LIANG Shangsong, ZHU Huaijie, YIN Jian
Incremental Fault Diagnosis Method Based on Metric Feature Distillation and Improved Sample Memory
Incremental learning-based fault diagnosis systems (IFD) are widely used because of their ability to handle constantly updated fault data and types. However, the catastrophic forgetting problem remains the most crucial contemporary challenge facing IFD ...
Qilang Min et al.
On Sequential Bayesian Inference for Continual Learning
Sequential Bayesian inference can be used for continual learning to prevent catastrophic forgetting of past tasks and provide an informative prior when learning new tasks.
Samuel Kessler et al.
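The idea stated in this abstract, that the posterior after one task becomes the prior for the next, can be illustrated with the simplest conjugate case: inferring a shared Gaussian mean across tasks. A minimal sketch (the task data below are made up for illustration):

```python
import numpy as np

def bayes_update(prior_mu, prior_var, data, noise_var=1.0):
    """Conjugate Gaussian update for the mean of N(mu, noise_var).
    Returning the posterior and feeding it back in as the next prior
    is the recursion behind sequential Bayesian continual learning:
    evidence from earlier tasks is retained, not overwritten."""
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + np.sum(data) / noise_var)
    return post_mu, post_var

mu, var = 0.0, 100.0                               # broad initial prior
for task_data in ([2.1, 1.9, 2.0], [2.2, 1.8]):    # two sequential "tasks"
    mu, var = bayes_update(mu, var, task_data)      # posterior -> next prior
```

After both tasks the posterior concentrates near the shared mean, and the shrinking variance shows that information from the first task persists into the second.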

