Domain-incremental white blood cell classification with privacy-aware continual learning [PDF]
White blood cell (WBC) classification plays a vital role in hematology for diagnosing various medical conditions. However, it faces significant challenges due to domain shifts caused by variations in sample sources (e.g., blood or bone marrow) and ...
Pratibha Kumari +6 more
doaj +2 more sources
Variational Continuous Bayesian Meta-learning Based Algorithm for Recommendation [PDF]
Meta-learning methods have been introduced into recommendation algorithms in recent years to alleviate the cold-start problem. The existing meta-learning algorithms can only improve the ability of the algorithm to deal with a set of statically ...
ZHU Wentao, LIU Wei, LIANG Shangsong, ZHU Huaijie, YIN Jian
doaj +1 more source
Do not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting [PDF]
One of the major limitations of deep learning models is that they face catastrophic forgetting in an incremental learning scenario. There have been several approaches proposed to tackle the problem of incremental learning. Most of these methods are based on knowledge distillation and do not adequately utilize the information provided by older task ...
Vinod K. Kurmi +3 more
openaire +2 more sources
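As context for the knowledge-distillation approaches this entry refers to, a minimal sketch of the standard distillation loss (in the Hinton style, with temperature-softened outputs) is shown below; the function names and the use of NumPy are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(old_logits, new_logits, T=2.0):
    """Cross-entropy of the new model's softened outputs against the
    frozen old model's softened outputs (a common distillation loss
    used to reduce forgetting in incremental learning)."""
    p_old = softmax(old_logits, T)
    p_new = softmax(new_logits, T)
    return float(-(p_old * np.log(p_new + 1e-12)).sum(axis=-1).mean())
```

The loss is minimized when the new model reproduces the old model's output distribution, which is how distillation-based methods anchor behavior on previously learned tasks.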
Incremental Fault Diagnosis Method Based on Metric Feature Distillation and Improved Sample Memory
Incremental learning-based fault diagnosis systems (IFD) are widely used because of their ability to handle constantly updated fault data and types. However, the catastrophic forgetting problem remains the most crucial contemporary challenge facing IFD ...
Qilang Min +3 more
doaj +1 more source
On Sequential Bayesian Inference for Continual Learning
Sequential Bayesian inference can be used for continual learning to prevent catastrophic forgetting of past tasks and provide an informative prior when learning new tasks.
Samuel Kessler +4 more
doaj +1 more source
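The core recursion behind sequential Bayesian inference for continual learning, as described in this entry, is that each task's posterior becomes the prior for the next task. A toy sketch, assuming a conjugate Gaussian mean-estimation model with known noise variance (all names illustrative):

```python
def bayes_update(prior_mean, prior_var, data, noise_var=1.0):
    """Return the posterior (mean, var) after observing `data` from one
    task, via standard conjugate Gaussian updates."""
    post_mean, post_var = prior_mean, prior_var
    for x in data:
        k = post_var / (post_var + noise_var)   # Kalman-style gain
        post_mean = post_mean + k * (x - post_mean)
        post_var = (1.0 - k) * post_var
    return post_mean, post_var

# Task stream: each task's posterior seeds the next task's prior,
# so evidence from earlier tasks is never discarded.
mean, var = 0.0, 10.0  # broad initial prior
for task_data in [[2.0, 2.2, 1.8], [2.1, 1.9]]:
    mean, var = bayes_update(mean, var, task_data)
```

Posterior uncertainty shrinks monotonically as evidence accumulates across tasks, which is what makes the running posterior an informative prior for new tasks.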
Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks
Catastrophic forgetting, which means a rapid forgetting of learned representations while learning new data/samples, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that can address the ...
Jonghong Kim +4 more
doaj +1 more source
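One standard way QR factorization supports incremental updates over online data, plausibly related to the framework in this entry (the paper's exact method is not shown here), is to keep only the R factor of the accumulated data matrix and refactor it together with newly arrived rows:

```python
import numpy as np

def incremental_r(R, new_rows):
    """Update the R factor of a thin QR when new data rows arrive,
    without refactorizing all past data: QR of the stacked matrix
    [R; new_rows] yields the same R (up to row signs) as QR of the
    full data matrix."""
    stacked = np.vstack([R, new_rows])
    _, R_new = np.linalg.qr(stacked)
    return R_new
```

This keeps memory proportional to the feature dimension rather than the number of samples seen, which is the usual motivation for QR-based incremental schemes.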
Embodiment can combat catastrophic forgetting [PDF]
We use an evolutionary robotics approach to demonstrate how the choice of robot morphology can affect one specific aspect of neural networks: their ability to resist catastrophic forgetting.
Joshua P. Powers +2 more
openaire +1 more source
Explain to Not Forget: Defending Against Catastrophic Forgetting with XAI
14 pages including appendix, 5 figures, 2 tables, 1 algorithm listing. Accepted for peer-reviewed publication at CD-MAKE 2022.
Sami Ede +6 more
openaire +3 more sources
Enhancing network modularity to mitigate catastrophic forgetting
Catastrophic forgetting occurs when learning algorithms change connections used to encode previously acquired skills to learn a new skill. Recently, a modular approach for neural networks was deemed necessary as learning problems grow in scale and ...
Lu Chen, Masayuki Murata
doaj +1 more source
Learning Without Forgetting: A New Framework for Network Cyber Security Threat Detection
Progressive learning addresses the problem of incrementally learning new tasks without compromising the prediction accuracy of previously learned tasks. In the context of artificial neural networks, several algorithms exist for achieving the progressive ...
Rupesh Raj Karn +2 more
doaj +1 more source

