Results 21 to 30 of about 33,100

Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks

open access: yes, Sensors, 2023
Catastrophic forgetting, the rapid loss of previously learned representations while learning new data or samples, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that can address the ...
Jonghong Kim   +4 more
doaj   +1 more source

Catastrophic Forgetting in the Context of Model Updates

open access: yes, CoRR, 2023
A large obstacle to deploying deep learning models in practice is updating them post-deployment (ideally, frequently). Deep neural networks can cost many thousands of dollars to train, and when new data arrives in the pipeline, one option is to train a new model from scratch (with randomly initialized weights) on all existing data.
Richard E. Harang, Hillary Sanders
openaire   +2 more sources

Explain to Not Forget: Defending Against Catastrophic Forgetting with XAI

open access: yes, 2022
14 pages including appendix, 5 figures, 2 tables, 1 algorithm listing. Accepted for peer-reviewed publication at CD-MAKE 2022.
Sami Ede   +6 more
openaire   +2 more sources

Enhancing network modularity to mitigate catastrophic forgetting

open access: yes, Applied Network Science, 2020
Catastrophic forgetting occurs when learning algorithms change connections used to encode previously acquired skills to learn a new skill. Recently, a modular approach for neural networks was deemed necessary as learning problems grow in scale and ...
Lu Chen, Masayuki Murata
doaj   +1 more source

Learning Without Forgetting: A New Framework for Network Cyber Security Threat Detection

open access: yes, IEEE Access, 2021
Progressive learning addresses the problem of incrementally learning new tasks without compromising the prediction accuracy of previously learned tasks. In the context of artificial neural networks, several algorithms exist for achieving the progressive ...
Rupesh Raj Karn   +2 more
doaj   +1 more source

Catastrophic Forgetting Problem in Semi-Supervised Semantic Segmentation

open access: yes, IEEE Access, 2022
Restricted by the cost of generating labels for training, semi-supervised methods have been applied to semantic segmentation tasks and have achieved varying degrees of success. Recently, semi-supervised learning methods have taken pseudo supervision as ...
Yan Zhou   +4 more
doaj   +1 more source

Overcoming Catastrophic Forgetting by XAI

open access: yes, CoRR, 2022
Explaining the behaviors of deep neural networks, usually treated as black boxes, is critical, especially now that they are being adopted across diverse aspects of human life. Taking advantage of interpretable machine learning (interpretable ML), this work proposes a novel tool called Catastrophic Forgetting Dissector (or CFD) to explain ...
openaire   +2 more sources

A divided and prioritized experience replay approach for streaming regression

open access: yes, MethodsX, 2021
In the streaming learning setting, an agent is presented with a data stream from which to learn in an online fashion. A common problem is catastrophic forgetting of old knowledge due to updates to the model.
Mikkel Leite Arnø   +2 more
doaj   +1 more source
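The entry above describes replaying stored samples, divided and prioritized, to counter forgetting during streaming regression. As an illustration only, not the authors' method, a minimal priority-weighted replay buffer might look like the sketch below; the class name, capacity, and error-based priority rule are all assumptions.

```python
import random

class PrioritizedReplayBuffer:
    """Minimal prioritized replay buffer for streaming regression.

    Samples with higher recorded error are replayed more often,
    which helps counter catastrophic forgetting of older data.
    """

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.samples = []     # stored (x, y) pairs
        self.priorities = []  # one positive priority per sample

    def add(self, x, y, error=1.0):
        if len(self.samples) >= self.capacity:
            # Evict the lowest-priority (best-learned) sample.
            idx = min(range(len(self.priorities)),
                      key=self.priorities.__getitem__)
            self.samples.pop(idx)
            self.priorities.pop(idx)
        self.samples.append((x, y))
        self.priorities.append(abs(error) + 1e-6)

    def sample(self, k):
        # Priority-proportional sampling (with replacement).
        k = min(k, len(self.samples))
        return random.choices(self.samples, weights=self.priorities, k=k)

buf = PrioritizedReplayBuffer(capacity=4)
for i in range(6):
    buf.add(x=i, y=2 * i, error=i)  # later samples get higher priority
batch = buf.sample(2)  # mini-batch to mix into each online update
```

Interleaving such replayed mini-batches with incoming stream data during each model update is one common way to keep older regions of the input space represented in training.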

On Robustness of Generative Representations Against Catastrophic Forgetting

open access: yes, 2021
Catastrophic forgetting of previously learned knowledge while learning new tasks is a widely observed limitation of contemporary neural networks. Although many continual learning methods have been proposed to mitigate this drawback, the main question remains unanswered: what is the root cause of catastrophic forgetting? In this work, we aim at answering this ...
Wojciech Masarczyk   +2 more
openaire   +3 more sources

Behavioral Experiments for Understanding Catastrophic Forgetting

open access: yes, CoRR, 2021
In this paper we explore whether the fundamental tool of experimental psychology, the behavioral experiment, has the power to generate insight not only into humans and animals, but artificial systems too. We apply the techniques of experimental psychology to investigating catastrophic forgetting in neural networks.
Samuel J. Bell, Neil D. Lawrence
openaire   +2 more sources
